[Statlist] FDS talk (Zoom) with Dmitry Yarotsky - 11 November 2021, 16:15-17:15 CET

Maurer Letizia  letiziamaurer at ethz.ch
Thu Oct 28 19:26:13 CEST 2021


We are pleased to announce the following online talk in our ETH Foundations of Data Science Seminar series:

"Explicit loss asymptotics in the gradient descent training of neural networks“ 
by Dmitry Yarotsky, Skolkovo Institute of Science and Technology (Skoltech), Moscow.

Date and time: Thursday, 11 November 2021, 16:15-17:15 CET
Place: Zoom at https://ethz.zoom.us/j/67996614965
Meeting ID: 679 9661 4965

Abstract: We show that the learning trajectory of a wide neural network in the lazy training regime can be described by an explicit asymptotic formula at large training times. Specifically, the leading term in the asymptotic expansion of the loss behaves as a power law $L(t) \sim C t^{-\xi}$, with the exponent $\xi$ expressed only through the data dimension, the smoothness of the activation function, and the class of functions being approximated. The constant $C$ can also be found analytically. Our results are based on a spectral analysis of the integral neural tangent kernel (NTK) operator. Importantly, the techniques we employ do not require the data distribution to have a specific form (for example, Gaussian), which makes our findings quite universal. This is joint work with M. Velikanov.
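For readers curious about where such power laws can come from, below is a minimal numerical sketch (not taken from the talk) of the standard spectral picture of lazy training: in a linearized model, each NTK eigenmode decays independently under gradient flow, so a power-law eigenvalue spectrum yields a power-law loss. The spectrum lambda_k ~ k^{-a}, the target coefficients c_k^2 ~ k^{-b}, and the resulting exponent xi = (b - 1)/a are illustrative assumptions for this toy model, not the talk's specific formula.

import numpy as np

# Toy spectral model of lazy (linearized/NTK) training -- illustrative
# assumptions only: eigenvalues lambda_k ~ k^{-a}, squared target
# coefficients c_k^2 ~ k^{-b}. Under gradient flow on the quadratic loss,
# each eigenmode decays independently:
#   L(t) = sum_k c_k^2 * exp(-2 * lambda_k * t).
a, b = 2.0, 3.0                   # hypothetical spectral decay exponents
k = np.arange(1.0, 200_000.0)     # eigenmode indices
lam = k ** (-a)                   # NTK eigenvalues
c2 = k ** (-b)                    # squared target coefficients

def loss(t):
    """Loss of the linearized model at training time t (gradient flow)."""
    return np.sum(c2 * np.exp(-2.0 * lam * t))

# At large t the slow modes dominate and L(t) ~ C * t^{-xi},
# with xi = (b - 1) / a for this toy spectrum.
t = np.logspace(1, 5, 20)
L = np.array([loss(ti) for ti in t])
xi_fit = -np.polyfit(np.log(t), np.log(L), 1)[0]
print(f"fitted exponent xi = {xi_fit:.3f}, predicted (b-1)/a = {(b - 1) / a:.3f}")

Running this prints a fitted exponent close to 1.0, matching (b - 1)/a = 1.0 for the chosen toy exponents; the talk's contribution is to derive the exponent and constant explicitly for actual wide networks.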

Host: Helmut Bölcskei


Organisers: A. Bandeira, H. Bölcskei, P. Bühlmann, F. Yang

Seminar website: https://math.ethz.ch/sfs/news-and-events/data-science-seminar.html



