
Session III.4 - Foundations of Numerical PDEs

Monday, June 19, 18:00 ~ 18:30

Mean field theory in Inverse Problems: from Bayesian sampling to overparameterization of networks

Qin Li

University of Wisconsin-Madison, USA

Bayesian sampling and neural networks are seemingly two different areas of machine learning, but both deal with systems of many particles. In sampling, one evolves a large number of samples (particles) to match a target distribution, and in optimizing over-parameterized neural networks, one can view the neurons as particles that feed each other information through the DNN flow. These perspectives allow us to employ mean-field theory, a powerful tool that translates the dynamics of a many-particle system into a partial differential equation (PDE), so that rich PDE analysis techniques can be used to understand both the convergence of sampling methods and the zero-loss property of over-parameterized ResNets. I would like to showcase the use of mean-field theory in these two machine learning areas, and I would also love to hear feedback from the audience on other possible applications.
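
A minimal sketch of the mean-field translation described above, using the standard (non-interacting) overdamped Langevin example rather than any specific method from the talk: N particles evolve as

\[
dX_t^i = -\nabla V(X_t^i)\,dt + \sqrt{2}\,dW_t^i, \qquad i = 1,\dots,N,
\]

and as $N \to \infty$ the empirical measure $\mu_t^N = \tfrac{1}{N}\sum_{i=1}^N \delta_{X_t^i}$ converges to a density $\rho_t$ satisfying the Fokker-Planck PDE

\[
\partial_t \rho_t = \nabla \cdot (\rho_t \nabla V) + \Delta \rho_t,
\]

whose stationary solution is the target distribution $\rho_\infty \propto e^{-V}$, so convergence of the sampler can be studied through the long-time behavior of this PDE.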

Joint work with Shi Chen (University of Wisconsin-Madison), Zhiyan Ding (University of California-Berkeley) and Steve Wright (University of Wisconsin-Madison).
