
Session II.2 - Continuous Optimization

Friday, June 16, 17:30–18:00

Adaptive first-order methods for convex optimization

Yura Malitsky

University of Vienna, Austria

In this talk, I will present a new way to make gradient descent and the proximal gradient method fully adaptive without increasing their per-iteration cost. We need no additional assumptions and even relax the global Lipschitzness assumption on the differentiable component. The stepsizes approximate the local curvature of the differentiable function and can increase from iteration to iteration. We will also discuss some limitations and open problems.

Joint work with Konstantin Mishchenko.
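
To make the idea concrete, below is a minimal Python sketch of an adaptive gradient descent in this spirit, following the stepsize rule from the speakers' earlier work on adaptive gradient descent: the stepsize is capped by an inverse local-curvature estimate ||x_k - x_{k-1}|| / (2 ||∇f(x_k) - ∇f(x_{k-1})||) and may otherwise grow by a factor sqrt(1 + θ), where θ is the ratio of consecutive stepsizes. All names, defaults, and safeguards here are illustrative assumptions, not the exact method from the talk.

    import numpy as np

    def adaptive_gradient_descent(grad, x0, n_iters=1000, lam0=1e-6):
        """Gradient descent with a fully adaptive stepsize (illustrative sketch).

        The stepsize is capped by the inverse local-curvature estimate
        ||x_k - x_{k-1}|| / (2 ||grad(x_k) - grad(x_{k-1})||) and can also
        grow from iteration to iteration, as described in the abstract.
        Names and defaults are hypothetical, not taken from the talk.
        """
        x_prev = np.asarray(x0, dtype=float)
        g_prev = grad(x_prev)
        lam = lam0
        x = x_prev - lam * g_prev      # one plain step to get a first pair
        theta = np.inf                 # no growth cap on the first adaptive step
        for _ in range(n_iters):
            g = grad(x)
            diff_x = np.linalg.norm(x - x_prev)
            diff_g = np.linalg.norm(g - g_prev)
            if diff_x == 0.0:          # consecutive iterates coincide: converged
                break
            # inverse local-curvature estimate (infinite if the gradient is flat)
            local = diff_x / (2.0 * diff_g) if diff_g > 0.0 else np.inf
            lam_new = min(np.sqrt(1.0 + theta) * lam, local)
            if not np.isfinite(lam_new):   # degenerate case: keep previous stepsize
                lam_new = lam
            theta = lam_new / lam
            x_prev, g_prev, lam = x, g, lam_new
            x = x - lam * g
        return x

    # Example: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    x_opt = adaptive_gradient_descent(lambda x: A @ x - b, np.zeros(2))
    print(x_opt, np.linalg.solve(A, b))  # the two should agree

Note that each iteration uses only the current gradient together with the previous iterate and gradient, so the per-iteration cost matches that of plain gradient descent, consistent with the claim in the abstract.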
