Stochastic Gradient Descent and Its Variants in Machine Learning
Abstract
Stochastic gradient descent (SGD) is a fundamental algorithm that has had a profound impact on machine learning. This article surveys some important results on SGD and its variants that arose in machine learning.
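For orientation, a minimal sketch of the plain SGD update that the survey takes as its starting point is given below, written in Python. The least-squares objective, learning rate, and synthetic data are illustrative assumptions, not drawn from the article itself.

import numpy as np

def sgd(grad, theta0, data, lr=0.01, epochs=10, seed=0):
    """Plain SGD: step along the gradient of one randomly chosen sample at a time."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    for _ in range(epochs):
        for i in rng.permutation(len(data)):      # sample without replacement each epoch
            theta -= lr * grad(theta, data[i])    # single-sample gradient step
    return theta

# Illustrative least-squares example (assumed, not from the article):
# each sample is (x, y) with model y ~ theta @ x.
def grad(theta, sample):
    x, y = sample
    return 2.0 * (theta @ x - y) * x

X = np.random.default_rng(1).normal(size=(100, 2))
y = X @ np.array([3.0, -1.0])
data = list(zip(X, y))
print(sgd(grad, np.zeros(2), data))   # should approach [3, -1]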
Keywords
Stochastic optimization, Gradient descent, Large scale optimization