Session II.2 - Continuous Optimization

Poster

Quadratic minimization: from conjugate gradient to an adaptive Heavy-ball method with Polyak step-sizes

Baptiste Goujaud

École Polytechnique, Institut Polytechnique de Paris, France

In this work, we propose an adaptive variant of the classical Heavy-ball method for convex quadratic minimization. The adaptivity relies crucially on so-called "Polyak step-sizes", which use knowledge of the optimal value of the optimization problem at hand instead of problem parameters such as a few eigenvalues of its Hessian. The resulting method turns out to be equivalent to a variant of the classical conjugate gradient method, and thereby inherits many of its attractive features, including finite-time convergence, instance optimality, and worst-case convergence rates. The classical gradient method with Polyak step-sizes is known to perform very well in the situations where it can be used, and whether momentum can be incorporated into it, and whether doing so improves the method, appeared to be an open question. We provide a definitive answer to this question for the minimization of convex quadratic functions, an arguably necessary first step toward developing such methods in more general setups.

Joint work with Adrien Taylor (Inria Paris), Aymeric Dieuleveut (École Polytechnique, Institut Polytechnique de Paris).
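To illustrate the main ingredient, the sketch below shows a generic Heavy-ball template combined with Polyak step-sizes on a convex quadratic f(x) = 0.5 x^T A x - b^T x. It is a minimal illustration, not the authors' method: the paper's adaptive momentum rule is not reproduced here, and the fixed momentum coefficient beta, as well as the function name polyak_heavy_ball, are hypothetical choices made only for this example (beta = 0 recovers the classical gradient method with Polyak step-sizes).

import numpy as np

def polyak_heavy_ball(A, b, x0, f_star, beta=0.0, n_iters=100):
    """Heavy-ball template with Polyak step-sizes (illustrative sketch).

    The step-size gamma_t = (f(x_t) - f*) / ||grad f(x_t)||^2 uses the
    known optimal value f_star instead of eigenvalues of the Hessian A.
    The momentum coefficient beta is a fixed, hypothetical parameter here,
    not the adaptive rule proposed in the paper.
    """
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b

    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n_iters):
        g = grad(x)
        gnorm2 = g @ g
        if gnorm2 == 0.0:                      # already at the minimizer
            break
        gamma = (f(x) - f_star) / gnorm2       # Polyak step-size
        x_next = x - gamma * g + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Small usage example on a random positive-definite quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)                        # positive-definite Hessian
b = rng.standard_normal(5)
x_opt = np.linalg.solve(A, b)                  # exact minimizer
f_star = 0.5 * x_opt @ A @ x_opt - b @ x_opt   # known optimal value
x = polyak_heavy_ball(A, b, np.zeros(5), f_star, beta=0.3, n_iters=200)
print(np.linalg.norm(x - x_opt))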