Session I.7 - Stochastic Computation
Wednesday, June 14, 18:00 ~ 18:30
Adaptive stochastic optimizers, neural nets and diffusion generative models
Sotirios Sabanis
University of Edinburgh, National Technical University of Athens and The Alan Turing Institute, UK
A new class of adaptive stochastic optimization algorithms with superior performance in the training of artificial neural networks has emerged recently, driven by fundamental progress in the theory of numerical methods for SDEs with superlinear coefficients. These stochastic optimizers address, both theoretically and in practice, well-known shortcomings in the training of neural networks, namely 'exploding' and 'vanishing' gradients.
Key findings of this new methodology will be reviewed and their links to diffusion generative models will be highlighted.
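Below is a minimal sketch of the taming idea underlying numerical schemes for SDEs with superlinearly growing coefficients, as it might appear in a stochastic gradient update; the update form, parameter names, and the Langevin noise term are illustrative assumptions, not the specific algorithm presented in the talk.

```python
import numpy as np

def tamed_sgd_step(theta, grad, lam=1e-2, beta=1e8, rng=None):
    """One taming-style stochastic gradient (Langevin-type) update.

    Illustrative sketch only: the taming factor 1 / (1 + lam * ||grad||)
    caps the effective step size when gradients become very large, the kind
    of mechanism used in numerical methods for SDEs with superlinear
    coefficients. All parameter names and the exact update form are
    assumptions for illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    g_norm = np.linalg.norm(grad)
    tamed_grad = grad / (1.0 + lam * g_norm)            # taming keeps the drift bounded
    noise = np.sqrt(2.0 * lam / beta) * rng.standard_normal(theta.shape)
    return theta - lam * tamed_grad + noise             # gradient step plus Gaussian noise
```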