
Session II.2 - Continuous Optimization

Poster

Consensus-Based Optimization: On Algorithms, Analysis, and Applications

Konstantin Riedl

Technical University of Munich, Munich Center for Machine Learning, Germany

Consensus-based optimization (CBO) is a multi-particle, derivative-free metaheuristic optimization method capable of globally minimizing high-dimensional nonconvex and nonsmooth functions, a task ubiquitous in applied mathematics. The method is inspired by consensus dynamics and opinion formation, and is flexible enough to incorporate different mechanisms widely used in evolutionary computation and machine learning. In this poster we shed light on the internal working principles of CBO that are responsible for the success of the method. Based on the experimentally supported intuition that, on average, CBO performs a gradient descent of the squared Euclidean distance to the global minimizer, we devise a novel technique for proving convergence to the global minimizer in mean-field law for a rich class of objective functions. In particular, we prove that CBO performs a convexification of a very large class of optimization problems as the number of optimizing particles tends to infinity. Our results hold under minimal assumptions about the initialization of the method and for objectives that are merely locally Lipschitz continuous. From this result it becomes apparent that the hardness of any global optimization problem is necessarily encoded in the mean-field approximation, or, more precisely, in the way in which the empirical measure of the finite particle dynamics approximates the mean-field limit. Given the central significance of this approximation for the overall computational complexity of the implemented numerical scheme, we establish a novel probabilistic quantitative result on the convergence of the interacting particle system towards the corresponding mean-field dynamics. By combining both results we provide the first holistic convergence proof of CBO methods on the plane. To demonstrate the versatility and practicability of CBO and its variants, we provide numerical experiments for well-understood high-dimensional problems from signal processing and machine learning.
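To make the particle dynamics described above concrete, the following Python sketch shows a basic Euler-Maruyama discretization of isotropic CBO, where each particle is pulled towards a Gibbs-weighted consensus point and perturbed by noise scaled with its distance to that point. It is a minimal illustration only: the parameter values, the initialization, and the Rastrigin-type test objective are assumptions chosen for demonstration, not the schemes or settings used in the referenced works.

import numpy as np

def consensus_point(X, f, alpha):
    # Gibbs-weighted average of the particles: weights exp(-alpha * f(x)).
    # X: (N, d) particle positions, f: objective, alpha: weight parameter.
    fx = np.array([f(x) for x in X])
    w = np.exp(-alpha * (fx - fx.min()))   # shift by the minimum for numerical stability
    return (w[:, None] * X).sum(axis=0) / w.sum()

def cbo_minimize(f, d, N=100, lam=1.0, sigma=0.8, alpha=30.0, dt=0.01, steps=1000, seed=None):
    # Euler-Maruyama sketch of isotropic CBO:
    #   dX^i = -lam (X^i - v_alpha) dt + sigma |X^i - v_alpha| dB^i.
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((N, d)) * 2.0  # illustrative initialization
    for _ in range(steps):
        v = consensus_point(X, f, alpha)
        diff = X - v
        noise = rng.standard_normal((N, d))
        X = X - lam * diff * dt \
              + sigma * np.linalg.norm(diff, axis=1, keepdims=True) * np.sqrt(dt) * noise
    return consensus_point(X, f, alpha)

if __name__ == "__main__":
    # Nonconvex Rastrigin-type objective in d = 4 with global minimizer at the origin.
    rastrigin = lambda x: np.sum(x**2 + 2.5 * (1.0 - np.cos(2.0 * np.pi * x)))
    print(cbo_minimize(rastrigin, d=4, seed=0))

Despite the nonconvexity of the objective, the returned consensus point is typically close to the global minimizer, which is the behavior the convexification and mean-field results above make rigorous.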

[1] M. Fornasier, T. Klock, and K. Riedl. Consensus-Based Optimization Methods Converge Globally. arXiv:2103.15130, submitted. 2022.

[2] M. Fornasier, T. Klock, and K. Riedl. Convergence of Anisotropic Consensus-Based Optimization in Mean-Field Law. International Conference on the Applications of Evolutionary Computation (Part of EvoStar). Springer, Cham, 2022.

[3] K. Riedl. Leveraging Memory Effects and Gradient Information in Consensus-Based Optimization: On Global Convergence in Mean-Field Law. arXiv:2211.12184, submitted. 2022.

Joint work with Massimo Fornasier (Technical University of Munich, Munich Center for Machine Learning, Germany) and Timo Klock (DeepTech Consulting, Norway).
