Sharpness, restart and acceleration

V Roulet, A d'Aspremont - Advances in Neural Information Processing Systems, 2017 - proceedings.neurips.cc
The Łojasiewicz inequality shows that Hölderian error bounds on the minimum of
convex optimization problems hold almost generically. Here, we clarify results of …
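
A minimal Python sketch of the scheduled-restart idea on a toy smooth objective; the schedule constants k0 and c below are placeholders, not the tuned values derived in the paper:

import numpy as np

def nesterov(grad, x0, L, iters):
    # Standard accelerated gradient with step 1/L; momentum rebuilt from t = 1.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

def scheduled_restart(grad, x0, L, rounds, k0=20, c=1.3):
    # Restart with a geometrically growing inner-iteration budget
    # (illustrative): under a sharpness/Holderian error bound, each
    # restart discards stale momentum, which is what improves the rate.
    x = x0
    for r in range(rounds):
        x = nesterov(grad, x, L, int(np.ceil(k0 * c**r)))
    return x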

ADMM without a fixed penalty parameter: Faster convergence with new adaptive penalization

Y Xu, M Liu, Q Lin, T Yang - Advances in Neural Information Processing Systems, 2017 - proceedings.neurips.cc
Alternating direction method of multipliers (ADMM) has received tremendous interest for
solving numerous problems in machine learning, statistics and signal processing. However …
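
The paper's adaptive penalization comes with convergence guarantees; for context, the long-standing heuristic is residual balancing (Boyd et al.), sketched below for the lasso in scaled-dual form. This is the classical heuristic, not the scheme proposed above:

import numpy as np

def lasso_admm(A, b, lam, rho=1.0, mu=10.0, tau=2.0, iters=200):
    # ADMM for min 0.5||Ax - b||^2 + lam||z||_1  s.t.  x = z,
    # with residual-balancing updates of the penalty rho.
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
        z_old = z
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)
        u = u + x - z
        r = np.linalg.norm(x - z)            # primal residual
        s = rho * np.linalg.norm(z - z_old)  # dual residual
        if r > mu * s:
            rho *= tau; u /= tau   # raise rho; rescale scaled dual
        elif s > mu * r:
            rho /= tau; u *= tau   # lower rho; rescale scaled dual
    return z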

Adaptive restart of accelerated gradient methods under local quadratic growth condition

O Fercoq, Z Qu - IMA Journal of Numerical Analysis, 2019 - academic.oup.com
By analyzing accelerated proximal gradient methods under a local quadratic growth
condition, we show that restarting these algorithms at any frequency gives a globally linearly …
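
A sketch of the restarted scheme for a composite objective, assuming a known proximal operator; per the result above, any fixed restart period gives linear convergence under quadratic growth, with the best period depending on the (typically unknown) growth constant:

import numpy as np

def fista(grad, prox, x0, L, iters):
    # Accelerated proximal gradient (FISTA) with step 1/L.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = prox(y - grad(y) / L, 1.0 / L)
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

def restart_fixed_period(grad, prox, x0, L, period, rounds):
    x = x0
    for _ in range(rounds):
        x = fista(grad, prox, x, L, period)  # momentum reset each round
    return x

For an l1 term lam*||x||_1, the prox is soft-thresholding: prox = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0).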

Stochastic convex optimization: Faster local growth implies faster global convergence

Y Xu, Q Lin, T Yang - International Conference on Machine Learning, 2017 - proceedings.mlr.press
In this paper, a new theory is developed for first-order stochastic convex optimization,
showing that the global convergence rate is sufficiently quantified by a local growth rate of …
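
The mechanism can be sketched as an epoch-restarted stochastic subgradient method: each epoch runs projected SGD in a ball around the previous epoch's average, then shrinks the radius and step size. The halving schedule and ball projection below are illustrative choices, not the paper's exact constants:

import numpy as np

def restarted_sgd(sgrad, x0, R0, eta0, epochs, T):
    x_c, R, eta = x0.copy(), R0, eta0
    for _ in range(epochs):
        x, avg = x_c.copy(), np.zeros_like(x_c)
        for _ in range(T):
            x = x - eta * sgrad(x)
            d = x - x_c                 # project back onto the ball B(x_c, R)
            nd = np.linalg.norm(d)
            if nd > R:
                x = x_c + (R / nd) * d
            avg += x
        x_c = avg / T                   # restart from the epoch average
        R, eta = R / 2, eta / 2         # geometric shrinkage
    return x_c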

Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions

V Apidopoulos, JF Aujol, C Dossal… - Mathematical Programming, 2021 - Springer
In this paper we study the convergence properties of Nesterov's family of inertial
schemes, a specific case of the inertial gradient descent algorithm, in the context of a smooth …
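
The family in question is inertial gradient descent with polynomial momentum (k-1)/(k+alpha-1); a minimal sketch, where alpha = 3 recovers the classical Nesterov choice and the growth/flatness conditions studied in the paper determine which alpha gives which rate:

import numpy as np

def inertial_gd(grad, x0, L, alpha=3.0, iters=500):
    x_prev, x = x0.copy(), x0.copy()
    for k in range(1, iters + 1):
        y = x + ((k - 1) / (k + alpha - 1)) * (x - x_prev)  # inertial step
        x_prev, x = x, y - grad(y) / L                      # gradient step
    return x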

Restarting the accelerated coordinate descent method with a rough strong convexity estimate

O Fercoq, Z Qu - Computational Optimization and Applications, 2020 - Springer
We propose new restarting strategies for the accelerated coordinate descent method. Our
main contribution is to show that for a well-chosen sequence of restarting times, the restarted …
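
A sketch of the outer restart loop, assuming an APPROX-style accelerated coordinate descent inner solver (Fercoq and Richtárik, single-coordinate sampling) and a restart period derived from a rough strong convexity estimate mu_hat; both the inner method's constants and the period formula are illustrative:

import numpy as np

def accel_cd(grad_i, v, x0, iters, rng):
    # APPROX-style accelerated coordinate descent, one coordinate per
    # step; v[i] is the coordinate-wise Lipschitz constant.
    n = len(x0)
    x, z, theta = x0.copy(), x0.copy(), 1.0 / n
    for _ in range(iters):
        y = (1 - theta) * x + theta * z
        i = rng.integers(n)
        dz = -grad_i(y, i) / (n * theta * v[i])
        z[i] += dz
        x = y.copy(); x[i] += n * theta * dz
        theta = (np.sqrt(theta**4 + 4 * theta**2) - theta**2) / 2
    return x

def restarted_acd(grad_i, v, x0, mu_hat, rounds, rng):
    # Restart period from the rough estimate mu_hat (assumed schedule;
    # the paper shows how to remain efficient when mu_hat is off).
    n = len(x0)
    period = int(np.ceil(n * np.sqrt(np.max(v) / mu_hat)))
    x = x0
    for _ in range(rounds):
        x = accel_cd(grad_i, v, x, period, rng)
    return x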

Adaptive SVRG methods under error bound conditions with unknown growth parameter

Y Xu, Q Lin, T Yang - Advances in Neural Information Processing Systems, 2017 - proceedings.neurips.cc
The error bound, an inherent property of an optimization problem, has recently seen revived
interest in the development of algorithms with improved global convergence without strong convexity. The …
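
SVRG itself is standard; the adaptivity concerns how long to run each restarted stage when the growth parameter in the error bound is unknown. Below, a standard SVRG epoch plus an illustrative doubling rule standing in for the paper's actual test (which differs), assuming min f = 0 for the progress check:

import numpy as np

def svrg_epoch(grads, x0, eta, m, rng):
    # One SVRG epoch: full gradient at the snapshot, then m
    # variance-reduced stochastic steps.
    n = len(grads)
    x_tilde = x0.copy()
    g_full = sum(g(x_tilde) for g in grads) / n
    x = x_tilde.copy()
    for _ in range(m):
        i = rng.integers(n)
        x = x - eta * (grads[i](x) - grads[i](x_tilde) + g_full)
    return x

def adaptive_svrg(grads, f, x0, eta, m0, stages, rng):
    x, m = x0, m0
    for _ in range(stages):
        x_new = svrg_epoch(grads, x, eta, m, rng)
        if f(x_new) > f(x) / 2:  # no halving (assumes min f = 0): the
            m *= 2               # growth was overestimated, so double m
        x = x_new
    return x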

Radial duality part II: applications and algorithms

B Grimmer - Mathematical Programming, 2024 - Springer
The first part of this work established the foundations of a radial duality between
nonnegative optimization problems, inspired by the work of Renegar (SIAM J Optim 26 (4) …

Frank-Wolfe method is automatically adaptive to error bound condition

Y Xu, T Yang - arXiv preprint arXiv:1810.04765, 2018 - arxiv.org
The error bound condition has recently gained renewed interest in optimization. It has been
leveraged to derive faster convergence for many popular algorithms, including subgradient …
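
"Automatically adaptive" here means that no step size or parameter of the method depends on the error bound; plain Frank-Wolfe with the classical step already enjoys the faster rate. A minimal sketch over the probability simplex:

import numpy as np

def frank_wolfe(grad, x0, iters):
    x = x0.copy()
    for t in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0               # linear minimization oracle on the simplex
        x = x + (2.0 / (t + 2)) * (s - x)   # classical, parameter-free step size
    return x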

Proximal gradient algorithm with momentum and flexible parameter restart for nonconvex optimization

Y Zhou, Z Wang, K Ji, Y Liang, V Tarokh - arXiv preprint arXiv:2002.11582, 2020 - arxiv.org
Various types of parameter restart schemes have been proposed for accelerated gradient
algorithms to facilitate their practical convergence in convex optimization. However, the …
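
One common criterion the paper generalizes is the function-value restart: reset the momentum whenever the objective increases. A minimal accelerated-proximal-gradient sketch with that test (illustrative only; the paper's flexible restart schemes and nonconvex guarantees go beyond it):

import numpy as np

def apg_fv_restart(f, grad, prox, x0, L, iters):
    x, y, t = x0.copy(), x0.copy(), 1.0
    fx = f(x)
    for _ in range(iters):
        x_next = prox(y - grad(y) / L, 1.0 / L)
        f_next = f(x_next)
        if f_next > fx:              # objective went up: restart momentum
            y, t = x.copy(), 1.0
            continue
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t, fx = x_next, t_next, f_next
    return x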