Mini-batch stochastic gradient descent with dynamic sample sizes
Dr. Michael R. Metel (Université Paris-Sud)
In this presentation we will look at stochastic gradient descent and its variants, with a particular focus on the mini-batch implementation, for solving constrained convex optimization problems. Basic convergence results will be reviewed, along with an overview of current research on improving its use in practice. New dynamic sample size rules will be presented which ensure a descent direction with high probability. Their performance will be compared to that of fixed sample size implementations in a financial application.
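As background for the talk, the following is a minimal sketch of mini-batch SGD with a dynamic sample size, on a synthetic least-squares problem. The growth rule shown (enlarge the batch when the sample variance of the mini-batch gradient dominates its squared norm, in the spirit of a norm test) is an illustrative assumption, not necessarily the rule presented in the talk; all problem data, constants, and function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic problem: minimize the average of (a_i^T x - b_i)^2
n, d = 1000, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

def grad_samples(x, idx):
    # Per-sample gradients 2 * a_i * (a_i^T x - b_i) for the mini-batch
    r = A[idx] @ x - b[idx]
    return 2.0 * A[idx] * r[:, None]

x = np.zeros(d)
batch = 8      # initial mini-batch size (assumed)
step = 0.01    # fixed step size (assumed)
for _ in range(200):
    idx = rng.choice(n, size=batch, replace=False)
    g = grad_samples(x, idx)
    g_mean = g.mean(axis=0)
    # Illustrative dynamic rule: double the sample size when the estimated
    # variance of the mini-batch gradient exceeds its squared norm, so the
    # step is a descent direction with high probability
    var_est = g.var(axis=0).sum() / batch
    if var_est > np.dot(g_mean, g_mean):
        batch = min(2 * batch, n)
    x -= step * g_mean

final_error = np.linalg.norm(x - x_true)
```

On this well-conditioned unconstrained problem the iterates approach `x_true` while the batch size grows as the gradient shrinks; the constrained setting of the talk would additionally require a projection step after each update.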
Date: February 15, 2018 (Thu), 14:00-14:50