Session II.2 - Continuous Optimization

Poster

Accelerated Adaptive Cubic Regularized Quasi-Newton Methods

Dmitry Kamzolov

Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), United Arab Emirates (UAE)

In this paper, we propose Cubic Regularized Quasi-Newton Methods for (strongly) star-convex optimization and Accelerated Cubic Regularized Quasi-Newton Methods for convex optimization. The proposed class of algorithms combines the global convergence of the Cubic Newton method with the affordability of constructing the Hessian approximation via a Quasi-Newton update. The Cubic Regularized Quasi-Newton Method is the first Quasi-Newton method with the global convergence rate $O\left(\frac{LR^2}{T}\right)$, and the Accelerated Cubic Regularized Quasi-Newton Method is the first with the global convergence rate $O\left(\frac{LR^2}{T^2}\right)$. To construct these methods, we introduce two novel concepts of Hessian inexactness and propose the Adaptive Inexact Cubic Regularized Newton algorithm and its accelerated version, which adjust to the approximation error. We show that different Quasi-Newton updates satisfy these approximation conditions and provide a complexity-efficient way to solve the cubic subproblem. We provide global convergence rates with explicit dependence on the Hessian approximation error. By combining cubic regularization, an acceleration technique, and adaptivity, our algorithms show good practical performance.
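To illustrate the core idea described above, the sketch below combines a cubic-regularized step with a BFGS-style Hessian approximation. This is a minimal illustrative implementation under assumed details: the simple fixed-point solver for the cubic subproblem, the fixed regularization parameter `M`, and all function names are assumptions for exposition, not the paper's adaptive or accelerated algorithms.

```python
import numpy as np

def cubic_subproblem(g, B, M, iters=50):
    """Approximately solve min_h <g,h> + 0.5 h'Bh + (M/6)||h||^3 via the
    standard one-dimensional fixed point on r = ||h|| (illustrative only;
    the paper gives a complexity-efficient subproblem solver)."""
    n = len(g)
    I = np.eye(n)
    r = np.linalg.norm(g) / max(M, 1e-12)  # initial guess for the step norm
    for _ in range(iters):
        h = -np.linalg.solve(B + 0.5 * M * r * I, g)
        r = np.linalg.norm(h)
    return h

def cubic_qn_minimize(grad, x0, M=1.0, steps=50):
    """Cubic-regularized quasi-Newton iteration: each step minimizes a cubic
    model built from the gradient and a BFGS approximation B of the Hessian."""
    x = x0.copy()
    B = np.eye(len(x0))          # initial Hessian approximation
    g = grad(x)
    for _ in range(steps):
        h = cubic_subproblem(g, B, M)
        x_new = x + h
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:           # curvature condition keeps B positive definite
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
        x, g = x_new, g_new
    return x
```

On a convex quadratic, the BFGS matrix accumulates the true curvature while the cubic term safeguards the early steps, so the iterates approach the minimizer globally, which is the qualitative behavior the convergence rates above formalize.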

The full paper is available at: https://arxiv.org/pdf/2302.04987.pdf

Joint work with Klea Ziu, Artem Agafonov and Martin Takáč (all Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), UAE).
