Stochastic Gradient Descent and Its Variants in Machine Learning

Praneeth Netrapalli

Abstract


Stochastic gradient descent (SGD) is a fundamental algorithm that has had a profound impact on machine learning. This article surveys important results on SGD and its variants that have arisen in machine learning.
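
For readers unfamiliar with the method, the following is a minimal sketch of the basic SGD update on a toy least-squares problem. The problem setup, step size, and iteration count are illustrative assumptions and are not taken from the article.

    import numpy as np

    # Toy least-squares problem: minimize (1/n) * sum_i (x_i^T w - y_i)^2
    rng = np.random.default_rng(0)
    n, d = 1000, 10
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true + 0.1 * rng.standard_normal(n)

    w = np.zeros(d)   # initial iterate
    eta = 0.01        # constant step size (illustrative choice)
    for t in range(5000):
        i = rng.integers(n)                      # sample one data point uniformly
        grad = 2.0 * (X[i] @ w - y[i]) * X[i]    # stochastic gradient estimate
        w = w - eta * grad                       # SGD update: w <- w - eta * grad

    print("estimation error:", np.linalg.norm(w - w_true))

Each iteration uses the gradient of a single randomly sampled example rather than the full dataset, which is what makes the method attractive for large-scale problems.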


Keywords


Stochastic optimization, Gradient descent, Large scale optimization
