Session I.6 - Mathematical Foundations of Data Assimilation and Inverse Problems
Poster
Reduced Order Methods for Linear Gaussian Inverse Problems on Separable Hilbert Spaces
Giuseppe Carere
University of Potsdam, Germany
In Bayesian inverse problems, the computation of the posterior distribution can be computationally demanding, especially in many-query settings such as filtering, where a new posterior distribution must be computed many times. In this work we consider some computationally efficient approximations of the posterior distribution for linear Gaussian inverse problems defined on separable Hilbert spaces. We measure the quality of these approximations using the Kullback-Leibler divergence of the approximate posterior with respect to the true posterior and investigate their optimality properties. The approximation method exploits the low-dimensional behaviour of the update from prior to posterior, which originates from a combination of prior smoothing, forward smoothing, measurement error, and a limited number of observations, analogous to the results of Spantini et al. [1] for finite-dimensional parameter spaces. Since the data are informative only on a low-dimensional subspace of the parameter space, the approximation class we consider for the posterior covariance consists of suitable low-rank updates of the prior. In the Hilbert space setting, care must be taken when inverting covariance operators. We address this challenge by using the Feldman-Hájek theorem for Gaussian measures.
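To make the low-rank prior-to-posterior update concrete, the following Python sketch reproduces the finite-dimensional analogue from Spantini et al. [1] to which the abstract refers. The dimensions, the randomly generated prior covariance and forward map, the noise level, and the rank r are illustrative assumptions, not quantities from the poster, and the Hilbert-space construction via the Feldman-Hájek theorem is not addressed here.

```python
import numpy as np

# Finite-dimensional analogue (Spantini et al. [1]) of the low-rank posterior
# covariance approximation: the posterior covariance is written as the prior
# covariance minus a low-rank negative update. All problem sizes below are
# illustrative assumptions.
rng = np.random.default_rng(0)
n, m, r = 200, 15, 10            # parameter dimension, number of observations, update rank

# Prior N(0, C_pr) with a rapidly decaying spectrum (prior smoothing).
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
C_pr = U @ np.diag(1.0 / (1.0 + np.arange(n)) ** 2) @ U.T

G = rng.standard_normal((m, n))  # linear forward map
Gamma_obs = 0.1 * np.eye(m)      # observation-noise covariance

# Exact posterior covariance: C_pos = (C_pr^{-1} + G^T Gamma_obs^{-1} G)^{-1}.
H = G.T @ np.linalg.solve(Gamma_obs, G)
C_pos = np.linalg.inv(np.linalg.inv(C_pr) + H)

# Prior-preconditioned data-misfit Hessian S^T H S, with C_pr = S S^T. Its
# eigenvalues decay quickly; the data are informative only on the subspace
# spanned by the leading eigenvectors.
S = np.linalg.cholesky(C_pr)
lam, W = np.linalg.eigh(S.T @ H @ S)
lam, W = lam[::-1], W[:, ::-1]   # sort eigenpairs in decreasing order

# Rank-r negative update of the prior covariance, as in [1]:
# C_hat = C_pr - sum_{i<=r} lam_i/(1+lam_i) (S w_i)(S w_i)^T.
W_r = S @ W[:, :r]
C_hat = C_pr - W_r @ np.diag(lam[:r] / (1.0 + lam[:r])) @ W_r.T

print("relative error of rank-%d update: %.2e"
      % (r, np.linalg.norm(C_hat - C_pos) / np.linalg.norm(C_pos)))
```

With r equal to the number of observations the update recovers the exact posterior covariance up to rounding; smaller ranks incur an error governed by the discarded eigenvalues, which is the mechanism the poster extends to separable Hilbert spaces.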
[1] Spantini, Alessio, Antti Solonen, Tiangang Cui, James Martin, Luis Tenorio, and Youssef Marzouk. “Optimal Low-Rank Approximations of Bayesian Linear Inverse Problems.” SIAM Journal on Scientific Computing 37, no. 6 (January 2015): A2451–87. https://doi.org/10.1137/140977308.
Joint work with Han Cheng Lie.