Acceleration methods

A d'Aspremont, D Scieur, A Taylor - Foundations and Trends® …, 2021 - nowpublishers.com
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …

Understanding the acceleration phenomenon via high-resolution differential equations

B Shi, SS Du, MI Jordan, WJ Su - Mathematical Programming, 2022 - Springer
Gradient-based optimization algorithms can be studied from the perspective of limiting
ordinary differential equations (ODEs). Motivated by the fact that existing ODEs do not …
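A minimal sketch of the limiting-ODE viewpoint the snippet refers to (our toy illustration, not the paper's high-resolution construction): on a quadratic f(x) = ½ xᵀAx with diagonal A, gradient descent with step s stays close to the gradient flow x′(t) = −Ax(t) sampled at t = ks, and the gap shrinks as s → 0.

```python
import numpy as np

# Hedged sketch (assumed diagonal quadratic, chosen for a closed form):
# gradient descent with step s on f(x) = 0.5 * x^T A x tracks the limiting
# ODE x'(t) = -A x(t), whose exact solution is x_i(t) = exp(-a_i t) x_i(0).
a = np.array([3.0, 1.0])           # diagonal of A (its eigenvalues)
x0 = np.array([1.0, 1.0])

s, k = 1e-3, 1000                  # step size and iteration count
x_gd = x0 * (1.0 - s * a) ** k     # closed form of k gradient-descent steps
x_ode = x0 * np.exp(-a * s * k)    # exact ODE solution at time t = k*s

# The discrete iterate stays O(s)-close to the continuous trajectory.
print(np.max(np.abs(x_gd - x_ode)))  # small, and shrinks as s -> 0
```

Shrinking s while holding t = ks fixed drives the two trajectories together, which is the sense in which the ODE is "limiting".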

An optimal first order method based on optimal quadratic averaging

D Drusvyatskiy, M Fazel, S Roy - SIAM Journal on Optimization, 2018 - SIAM
In a recent paper, Bubeck, Lee, and Singh introduced a new first order method for
minimizing smooth strongly convex functions. Their geometric descent algorithm, largely …

A geometric structure of acceleration and its role in making gradients small fast

J Lee, C Park, E Ryu - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Since Nesterov's seminal 1983 work, many accelerated first-order optimization methods
have been proposed, but their analyses lack a common unifying structure. In this work, we …

Heavy Ball Momentum for Non-Strongly Convex Optimization

JF Aujol, C Dossal, H Labarrière… - arXiv preprint arXiv …, 2024 - arxiv.org
When considering the minimization of a quadratic or strongly convex function, it is well
known that first-order methods involving an inertial term weighted by a constant-in-time …
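The "well known" quadratic case the snippet alludes to can be sketched as follows (our toy setup, not the paper's non-strongly-convex analysis): Polyak's heavy-ball method adds an inertial term β(x_k − x_{k−1}) with a constant-in-time weight, and with the classical tuning it far outpaces plain gradient descent on an ill-conditioned strongly convex quadratic.

```python
import numpy as np

# Hedged sketch (assumed diagonal quadratic f(x) = 0.5 * x^T A x):
# heavy-ball with the classical constant tuning vs. tuned gradient descent.
a = np.array([100.0, 1.0])         # eigenvalues: L = 100, mu = 1
grad = lambda x: a * x

L, mu = a.max(), a.min()
eta = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2                       # step size
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2  # momentum

x_prev = x = np.array([1.0, 1.0])
for _ in range(100):
    # inertial term weighted by the constant-in-time beta
    x, x_prev = x - eta * grad(x) + beta * (x - x_prev), x

y = np.array([1.0, 1.0])
for _ in range(100):
    y = y - (2.0 / (L + mu)) * grad(y)   # gradient descent, optimal fixed step

print(np.linalg.norm(x), np.linalg.norm(y))  # heavy ball is far closer to 0
```

The contrast reflects the rates: heavy ball contracts like ((√L−√μ)/(√L+√μ))ᵏ versus ((L−μ)/(L+μ))ᵏ for gradient descent, a quadratic improvement in the condition number.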

Underestimate sequences via quadratic averaging

C Ma, NVC Gudapati, M Jahani… - arXiv preprint arXiv …, 2017 - researchgate.net
In this work we introduce the concept of an Underestimate Sequence (UES), which is a
natural extension of Nesterov's estimate sequence [16]. Our definition of a UES utilizes three …

Gradient Descent and the Power Method: Exploiting their connection to find the leftmost eigen-pair and escape saddle points

R Tappenden, M Takáč - arXiv preprint arXiv:2211.00866, 2022 - arxiv.org
This work shows that applying Gradient Descent (GD) with a fixed step size to minimize a
(possibly nonconvex) quadratic function is equivalent to running the Power Method (PM) on …
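The stated equivalence can be sketched in a few lines (our toy version, not the paper's construction): a GD step with fixed step η on f(x) = ½ xᵀAx maps x to (I − ηA)x, so iterating GD is the power method applied to B = I − ηA, whose dominant eigenvector is the eigenvector of A with the smallest ("leftmost") eigenvalue.

```python
import numpy as np

# Hedged sketch (assumed nonconvex quadratic with a known leftmost pair):
# GD with fixed step eta on f(x) = 0.5 * x^T A x iterates
# x_{k+1} = (I - eta*A) x_k, i.e. the power method on B = I - eta*A.
rng = np.random.default_rng(0)
A = np.diag([5.0, 2.0, -1.0])     # leftmost eigenvalue -1, eigenvector e3
eta = 0.1
B = np.eye(3) - eta * A           # eigenvalues 0.5, 0.8, 1.1 -> e3 dominates

x = rng.standard_normal(3)
for _ in range(200):
    x = B @ x                     # one GD step == one power-method step
    x /= np.linalg.norm(x)        # power-method normalization

# x aligns with the leftmost eigen-pair of A
print(np.round(np.abs(x), 3))     # ~ [0, 0, 1]
```

The normalization is what keeps the iterate bounded here: since A has a negative eigenvalue, the raw GD iterate would diverge along that direction, which is exactly the escape behavior the power-method view exploits.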

Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences

M Jahani, NVC Gudapati, C Ma, R Tappenden… - Computational …, 2021 - Springer
In this work we introduce the concept of an Underestimate Sequence (UES), which is
motivated by Nesterov's estimate sequence. Our definition of a UES utilizes three …

A study of inertial methods in optimization and their behavior under geometry conditions

H Labarrière - 2023 - theses.hal.science
This thesis manuscript is devoted to the optimization of composite convex functions in a
deterministic setting. In this setting, I focused on the convergence analysis …
