Session II.4 - Foundations of Data Science and Machine Learning

Poster

Convergence rates for Lipschitz learning on very sparse graphs

Tim Roith

Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany

This poster is devoted to proving continuum limits of Lipschitz learning, a graph-based semi-supervised learning method. Given a partial labeling of a finite data set equipped with a graph structure, the goal is to extend the labels to the whole data set without increasing their Lipschitz constant. In particular, we are interested in the continuum limit of this problem as the number of data points tends to infinity. This is first established via $\Gamma$-convergence of the graph Lipschitz constant functional to the functional $u\mapsto\|\nabla u\|_\infty$. The corresponding minimization problem is a Lipschitz extension task which does not admit a unique solution. We therefore consider absolutely minimizing Lipschitz extensions (AMLEs), which are the unique solutions of the infinity Laplace equation. We demonstrate that quantitative convergence rates of AMLEs on graphs to their continuum counterparts can be derived from a certain ratio convergence property of the graph distance function. We apply this result to arbitrary geometric graphs as well as to homogeneous Poisson point processes; in both cases, our techniques yield convergence rates for very sparsely connected graphs.
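To illustrate the discrete problem, here is a minimal sketch (not taken from the poster): on an unweighted graph, the graph infinity-Laplace equation reduces to the midpoint rule $u(x) = \tfrac{1}{2}(\max_{y\sim x} u(y) + \min_{y\sim x} u(y))$ at every unlabeled node, and a simple fixed-point iteration enforcing this rule computes the graph AMLE. The function name `graph_amle` and the setup below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def graph_amle(neighbors, values, labeled, tol=1e-8, max_iter=100_000):
    """Jacobi-type iteration for the graph infinity-Laplace equation.

    neighbors : list of index lists; neighbors[i] = nodes adjacent to i
    values    : float array; entries at labeled nodes hold the given labels
    labeled   : boolean array marking the labeled nodes
    """
    u = np.asarray(values, dtype=float).copy()
    for _ in range(max_iter):
        u_new = u.copy()
        for i, nbrs in enumerate(neighbors):
            if labeled[i] or not nbrs:
                continue  # keep given labels fixed; skip isolated nodes
            nb = u[nbrs]
            # infinity-harmonic update: midpoint of the neighbor extremes
            u_new[i] = 0.5 * (nb.max() + nb.min())
        if np.max(np.abs(u_new - u)) < tol:
            return u_new
        u = u_new
    return u

# Path graph 0-1-2-3-4 with labels u(0)=0 and u(4)=1; the AMLE is the
# linear interpolant, so the result approaches [0, 0.25, 0.5, 0.75, 1].
neighbors = [[1], [0, 2], [1, 3], [2, 4], [3]]
values = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
labeled = np.array([True, False, False, False, True])
print(graph_amle(neighbors, values, labeled))
```

On weighted graphs the update would instead use a weighted max-min average; the unweighted midpoint rule is shown here only for concreteness.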

Joint work with Leon Bungert (Technische Universität Berlin, Germany) and Jeff Calder (University of Minnesota, United States).