Session II.5 - Random Matrices
Thursday, June 15, 15:30 ~ 16:00
Predictability and universality in numerical computation via orthogonal polynomials and local laws
Thomas Trogdon
University of Washington, United States
Many iterative algorithms from numerical linear algebra (Krylov subspace methods, in particular) connect directly to polynomials orthogonal with respect to a measure generated by a matrix and a vector, the so-called eigenvector empirical spectral distribution (VESD). The analysis of these algorithms can often be reduced to understanding properties of these orthogonal polynomials. The broad goal of this talk is to use this connection to describe the behavior of these algorithms when applied to random matrices.
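The connection above can be made concrete with the Lanczos algorithm: run on a symmetric matrix $A$ and unit vector $b$, it produces a tridiagonal Jacobi matrix whose entries are the three-term recurrence coefficients of the polynomials orthogonal with respect to the VESD of the pair $(A, b)$. The sketch below (an illustration, not part of the talk; the matrix size, iteration count, and GOE-style normalization are choices made here for demonstration) verifies the defining moment identity $e_1^\top T_k^m e_1 = b^\top A^m b$, i.e., that the Jacobi matrix reproduces the moments of the VESD.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # matrix size (chosen for illustration)

# A GOE-like symmetric random matrix and a random unit starting vector.
G = rng.standard_normal((n, n))
A = (G + G.T) / np.sqrt(2 * n)
b = rng.standard_normal(n)
b /= np.linalg.norm(b)


def lanczos(A, b, k):
    """k steps of Lanczos with full reorthogonalization.

    Returns the k x k Jacobi (tridiagonal) matrix whose entries are the
    three-term recurrence coefficients of the polynomials orthogonal with
    respect to the VESD of (A, b).
    """
    n = len(b)
    Q = np.zeros((n, k + 1))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        # Full reorthogonalization: cheap insurance against the loss of
        # orthogonality that plain Lanczos suffers in floating point.
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        beta[j] = np.linalg.norm(w)
        Q[:, j + 1] = w / beta[j]
    return np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)


k = 10
T = lanczos(A, b, k)

# Moment identity: e_1^T T^m e_1 = b^T A^m b, the m-th moment of the VESD,
# holds exactly (up to roundoff) for m = 0, 1, ..., k-1.
max_err = max(
    abs(np.linalg.matrix_power(T, m)[0, 0]
        - b @ np.linalg.matrix_power(A, m) @ b)
    for m in range(k)
)
print(f"max moment discrepancy over m < {k}: {max_err:.2e}")
```

The entries of `T` are exactly the recurrence coefficients whose large-degree behavior the perturbation theory described below addresses; properties of conjugate gradient and related Krylov methods (e.g., error norms) are functions of this Jacobi matrix.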
For many random matrix distributions, the limiting VESD is known (or at least characterized) via a local law, yet the ill-conditioned nature of the mapping from measure to orthogonal polynomials appears to limit the conclusions one can make. To partially overcome this difficulty, we develop a new Riemann--Hilbert-based perturbation theory for orthogonal polynomials that takes the local law as input. This allows one to make precise statements about the random polynomials as their degree diverges, and therefore, about algorithms applied to random matrices as the number of iterations diverges. Similar technology allows one to understand the behavior of some of these algorithms in finite-precision arithmetic.
Joint work with Percy Deift (NYU), Xiucai Ding (UC Davis), Tyler Chen (NYU), and Elliot Paquette (McGill University).