Session I.6 - Mathematical Foundations of Data Assimilation and Inverse Problems
Monday, June 12, 14:00 ~ 14:30
Gradient-based dimension reduction for solving Bayesian inverse problems
Ricardo Baptista
California Institute of Technology, USA
Computational Bayesian inference aims to characterize the posterior probability distribution for the parameters of a statistical model. The complexity of many inference methods, such as MCMC and variational inference, however, typically scales poorly with the growing dimensions of the model parameters and data. A recent approach to dealing with high- or even infinite-dimensional parameters is to exploit low-dimensional structure in the inverse problem and approximately reformulate it in low-to-moderate dimensions. In this presentation, we will introduce an information-theoretic analysis to bound the error incurred by reducing the dimensions of both the parameters and the data. This bound exploits gradient evaluations of the log-likelihood function to identify relevant low-dimensional subspaces for these variables, as well as to reveal reduced dimensions that result in minimal error. The benefit of the proposed dimension reduction technique will be demonstrated using several inference algorithms on applications including image processing and data assimilation for aerodynamic flows.
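As a rough illustration of the kind of gradient-based subspace identification the abstract describes, the following sketch estimates a diagnostic matrix from outer products of log-likelihood gradients averaged over prior samples, and takes its leading eigenvectors as the informed parameter subspace. The toy linear-Gaussian problem, the sample sizes, and all variable names are illustrative assumptions, not the authors' setup; the actual error bound and data-reduction analysis are in the talk and the associated papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy inverse problem: y = G x + noise, with a forward map G
# whose columns decay so the likelihood is informative in few directions.
d, n_obs = 20, 5
G = rng.normal(size=(n_obs, d)) @ np.diag(1.0 / (1.0 + np.arange(d)))
sigma = 0.1
x_true = rng.normal(size=d)
y = G @ x_true + sigma * rng.normal(size=n_obs)

def grad_log_likelihood(x):
    # For Gaussian observation noise: grad_x log p(y|x) = G^T (y - G x) / sigma^2
    return G.T @ (y - G @ x) / sigma**2

# Diagnostic matrix: Monte Carlo estimate of the prior expectation of the
# outer product of log-likelihood gradients (standard-normal prior assumed).
samples = rng.normal(size=(500, d))
H = np.mean([np.outer(g, g) for g in map(grad_log_likelihood, samples)], axis=0)

# Leading eigenvectors span the likelihood-informed parameter subspace; the
# decay of the eigenvalues indicates how much is lost by truncating to r dims.
eigvals, eigvecs = np.linalg.eigh(H)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
r = 3
U_r = eigvecs[:, :r]  # reduced basis: project parameters as U_r.T @ x
```

A rapid drop in `eigvals` after the first few entries signals that the inference can be restricted to a small subspace with little loss; an analogous construction with gradients taken with respect to the data identifies a reduced data subspace.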
Joint work with Youssef Marzouk (Massachusetts Institute of Technology, USA) and Olivier Zahm (INRIA Grenoble Alpes, France).