[Statlist] Reminder: ETH Young Data Science Researcher Seminar Zurich, Virtual Seminar by Hongzhou Lin, MIT (Massachusetts Institute of Technology), 18 September 2020

Maurer Letizia  letizia.maurer at ethz.ch
Thu Sep 17 07:34:56 CEST 2020


Dear all

We are glad to announce the following talk in the virtual ETH Young Data Science Researcher Seminar Zurich:

"Stochastic Optimization with Non-stationary Noise "  
by Hongzhou Lin, MIT (Massachusetts Institute of Technology)

Time: Friday, 18 September 2020, 15:00-16:00
Place: Zoom at https://ethz.zoom.us/j/92367940258

Abstract: We investigate stochastic optimization problems under relaxed assumptions on the distribution of the noise, motivated by empirical observations in neural network training. Standard results on optimal convergence rates for stochastic optimization assume either that there exists a uniform bound on the moments of the gradient noise or that the noise decays as the algorithm progresses. These assumptions do not match the empirical behavior of optimization algorithms used in neural network training, where the noise level in stochastic gradients can even increase with time. We address this behavior by studying convergence rates of stochastic gradient methods when the second moment (or variance) of the stochastic oracle changes as the iterations progress. When the variation in the noise is known, we show that it is always beneficial to adapt the step size and exploit the noise variability. When the noise statistics are unknown, we obtain similar improvements by developing an online estimator of the noise level, thereby recovering close variants of RMSProp. Consequently, our results reveal an important scenario in which adaptive step-size methods outperform SGD.
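
For readers who would like a concrete picture of the adaptive step-size idea described in the abstract, below is a minimal illustrative sketch in Python. It shows an RMSProp-style update that divides each step by an online estimate of the stochastic gradient's second moment, so the effective step size shrinks as the noise level grows. This is our own toy illustration, not the speaker's actual algorithm; all names, the toy objective, and the hyperparameters are assumptions made for the example.

    import numpy as np

    def adaptive_sgd(grad_fn, x0, steps=2000, lr=0.05, beta=0.9, eps=1e-8):
        """RMSProp-style update: scale each step by an online estimate of
        the stochastic gradient's second moment, so the effective step size
        shrinks when the noise level rises. Purely illustrative; the
        hyperparameters are assumptions, not values from the talk."""
        x = np.asarray(x0, dtype=float)
        v = np.zeros_like(x)                    # running estimate of E[g^2]
        for t in range(steps):
            g = grad_fn(x, t)                   # stochastic gradient oracle at step t
            v = beta * v + (1.0 - beta) * g**2  # exponential moving average of g^2
            x = x - lr * g / (np.sqrt(v) + eps) # step scaled by estimated noise level
        return x

    # Toy oracle: gradient of f(x) = 0.5 * ||x||^2 with noise whose standard
    # deviation grows with t, mimicking the non-stationary regime in the abstract.
    rng = np.random.default_rng(0)
    noisy_grad = lambda x, t: x + (0.1 + 0.001 * t) * rng.standard_normal(x.shape)

    x_final = adaptive_sgd(noisy_grad, x0=np.ones(10))
    print(np.linalg.norm(x_final))

With a fixed step size, the growing noise would keep kicking the iterates away from the optimum; here the online second-moment estimate automatically damps the steps as the noise increases, which is the intuition behind the scenario the abstract describes.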

Best wishes,

M. Löffler, A. Taeb, Y. Chen

Seminar website: https://math.ethz.ch/sfs/news-and-events/young-data-science.html
