A survey on distributed online optimization and online games

X Li, L Xie, N Li - Annual Reviews in Control, 2023 - Elsevier
Distributed online optimization and online games have been increasingly researched in the
last decade, mostly motivated by their wide applications in sensor networks, robotics (e.g., …
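
As a rough illustration of the class of methods such surveys cover (not an algorithm taken from the survey itself), the sketch below runs distributed online gradient descent: each agent mixes its neighbors' iterates through a doubly stochastic matrix W and then takes a step on its own time-varying local loss. The network topology, losses, and step size are hypothetical.

```python
import numpy as np

# Minimal sketch of distributed online gradient descent over a fixed network.
# Each agent i mixes neighbors' iterates with a doubly stochastic matrix W,
# then steps on its own time-varying loss f_{i,t}. All data are illustrative.
rng = np.random.default_rng(0)
n_agents, dim, T, eta = 4, 3, 50, 0.1

# Doubly stochastic mixing matrix for a ring of 4 agents (assumed topology).
W = np.array([[0.5, 0.25, 0.0, 0.25],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])

x = np.zeros((n_agents, dim))                     # one local iterate per agent
for t in range(T):
    targets = rng.normal(size=(n_agents, dim))    # per-agent online data at time t
    grads = x - targets                           # gradient of 0.5*||x_i - target_i||^2
    x = W @ x - eta * grads                       # consensus step, then local gradient step
print("disagreement:", np.linalg.norm(x - x.mean(axis=0)))
```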

Fast optimization via inertial dynamics with closed-loop damping

H Attouch, RI Boţ, ER Csetnek - Journal of the European Mathematical …, 2022 - ems.press
In a real Hilbert space H, in order to develop fast optimization methods, we analyze the
asymptotic behavior, as time t tends to infinity, of a large class of autonomous dissipative …
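
To make the setting concrete, here is a minimal numerical sketch (not taken from the paper) of an inertial system x'' + gamma(x, x') x' + grad f(x) = 0 whose damping coefficient is fed back from the current state, integrated with semi-implicit Euler; the particular damping law and test function are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch: semi-implicit Euler discretization of an inertial system
#   x'' + gamma(x, v) * x' + grad f(x) = 0
# in which the damping coefficient depends on the current state ("closed-loop"
# damping). The specific damping law below is an assumption, not the paper's.
def grad_f(x):                           # f(x) = 0.5 * ||x||^2 on R^2
    return x

h = 0.05
x = np.array([2.0, -1.0])
v = np.zeros(2)
for k in range(400):
    gamma = 1.0 + np.linalg.norm(v)      # hypothetical state-dependent damping
    v = v - h * (gamma * v + grad_f(x))  # velocity update
    x = x + h * v                        # position update
print(x)                                 # should be close to the minimizer 0
```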

Self-consistent dynamical field theory of kernel evolution in wide neural networks

B Bordelon, C Pehlevan - Advances in Neural Information …, 2022 - proceedings.neurips.cc
We analyze feature learning in infinite-width neural networks trained with gradient flow
through a self-consistent dynamical field theory. We construct a collection of deterministic …
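
A rough, purely empirical illustration of the phenomenon studied (not the paper's field-theory derivation): train a wide one-hidden-layer network with discretized gradient flow and measure how far the empirical feature kernel K = Phi Phi^T / N moves from its value at initialization. Width, data, and step size below are arbitrary choices for the sketch.

```python
import numpy as np

# Track the empirical feature kernel of a wide one-hidden-layer network during
# (discretized) gradient flow; its movement from initialization is the
# feature-learning effect the dynamical field theory describes.
rng = np.random.default_rng(0)
P, d, N, lr, steps = 20, 5, 1000, 0.1, 1000

X = rng.normal(size=(P, d)) / np.sqrt(d)
y = rng.normal(size=P)
W = rng.normal(size=(N, d))              # input-to-hidden weights
a = rng.normal(size=N)                   # hidden-to-output weights

def forward(W, a):
    H = np.tanh(X @ W.T)                 # hidden features, shape (P, N)
    return H, H @ a / np.sqrt(N)         # features and network output

H0, _ = forward(W, a)
K0 = H0 @ H0.T / N                       # feature kernel at initialization
for _ in range(steps):
    H, out = forward(W, a)
    err = out - y                        # gradient of 0.5 * ||out - y||^2
    grad_a = H.T @ err / np.sqrt(N)
    grad_W = ((err[:, None] * (1 - H**2)) * a / np.sqrt(N)).T @ X
    a -= lr * grad_a
    W -= lr * grad_W

H_T, _ = forward(W, a)
K_T = H_T @ H_T.T / N
print("relative kernel movement:", np.linalg.norm(K_T - K0) / np.linalg.norm(K0))
```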

A Lyapunov analysis of accelerated methods in optimization

AC Wilson, B Recht, MI Jordan - Journal of Machine Learning Research, 2021 - jmlr.org
Accelerated optimization methods, such as Nesterov's accelerated gradient method, play a
significant role in optimization. Several accelerated methods are provably optimal under …
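
The canonical instance of such a Lyapunov argument, stated here for orientation only (the paper develops a much more general framework), is the continuous-time model of Nesterov's method,
\[
\ddot X(t) + \tfrac{3}{t}\,\dot X(t) + \nabla f(X(t)) = 0,
\]
with the energy
\[
\mathcal E(t) = t^2\bigl(f(X(t)) - f^\star\bigr) + \tfrac12\bigl\|\,2\bigl(X(t) - x^\star\bigr) + t\,\dot X(t)\bigr\|^2 .
\]
Along solutions, the ODE gives \(\dot{\mathcal E}(t) = 2t\bigl(f(X(t)) - f^\star\bigr) - 2t\,\langle \nabla f(X(t)),\, X(t) - x^\star\rangle \le 0\) by convexity of \(f\), hence \(f(X(t)) - f^\star \le \mathcal E(t_0)/t^2\).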

Analysis and synthesis of gradient algorithms based on fractional-order system theory

Y Wei, Y Chen, X Zhao, J Cao - IEEE Transactions on Systems …, 2022 - ieeexplore.ieee.org
In this study, a framework for analyzing and synthesizing gradient algorithms is proposed in accordance with
nabla fractional-order system theory. Unlike most of the literature, the gradient algorithm is …
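
One common way to phrase the idea (the paper's exact nabla fractional-difference definitions may differ): writing \(D\) for the backward (nabla) difference, plain gradient descent is the first-order difference equation
\[
D\,x(k) := x(k) - x(k-1) = -\mu\,\nabla f\bigl(x(k-1)\bigr),
\]
and a fractional-order variant replaces \(D\) by a Grünwald–Letnikov nabla difference of order \(\alpha > 0\),
\[
D^{\alpha} x(k) = \sum_{j=0}^{k-k_0} (-1)^j \binom{\alpha}{j}\, x(k-j),
\qquad
D^{\alpha} x(k) = -\mu\,\nabla f\bigl(x(k-1)\bigr),
\]
which reduces to the usual update when \(\alpha = 1\).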

Unifying Nesterov's accelerated gradient methods for convex and strongly convex objective functions

J Kim, I Yang - International Conference on Machine …, 2023 - proceedings.mlr.press
Although Nesterov's accelerated gradient method (AGM) has been studied from various
perspectives, it remains unclear why the most popular forms of AGMs must handle convex …
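
For reference, the two textbook momentum choices that such a unification targets can be sketched as follows; the quadratic test problem is arbitrary, the \((k-1)/(k+2)\) and \((\sqrt{\kappa}-1)/(\sqrt{\kappa}+1)\) schedules are the standard ones, and the paper's own unified scheme is not reproduced here.

```python
import numpy as np

# Nesterov's method with the two standard momentum schedules: a (k-1)/(k+2)
# weight for merely convex objectives and a constant weight based on the
# condition number kappa for strongly convex ones.
A = np.diag([1.0, 50.0]); b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b            # gradient of 0.5 x^T A x - b^T x
L, mu = 50.0, 1.0                     # smoothness and strong convexity constants
kappa = L / mu

def nag(momentum, iters=200):
    x = np.zeros(2); x_prev = x.copy()
    for k in range(1, iters + 1):
        y = x + momentum(k) * (x - x_prev)   # extrapolation step
        x_prev, x = x, y - grad(y) / L       # gradient step at the look-ahead point
    return x

x_cvx = nag(lambda k: (k - 1) / (k + 2))                             # convex schedule
x_scvx = nag(lambda k: (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1))  # strongly convex schedule
print(np.linalg.norm(grad(x_cvx)), np.linalg.norm(grad(x_scvx)))
```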

Continuous-time analysis of accelerated gradient methods via conservation laws in dilated coordinate systems

JJ Suh, G Roh, EK Ryu - International Conference on …, 2022 - proceedings.mlr.press
We analyze continuous-time models of accelerated gradient methods through deriving
conservation laws in dilated coordinate systems. Namely, instead of analyzing the dynamics …
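
A quick numerical sketch of the continuous-time model from which such analyses typically start (the paper's dilated-coordinate construction itself is not reproduced here): integrate the ODE X'' + (3/t) X' + grad f(X) = 0 and check that t^2 (f(X(t)) - f*) stays bounded, matching the expected O(1/t^2) rate.

```python
import numpy as np

# Semi-implicit Euler integration of X'' + (3/t) X' + grad f(X) = 0, the
# standard continuous-time limit of Nesterov's method, started at t0 = 1.
A = np.diag([1.0, 10.0]); b = np.array([1.0, 0.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b); f_star = f(x_star)

h, t = 1e-3, 1.0
X = np.array([3.0, -3.0]); V = np.zeros(2)
for _ in range(200_000):
    V += h * (-(3.0 / t) * V - grad(X))   # velocity update
    X += h * V                            # position update
    t += h
print("t^2 * (f - f*):", t**2 * (f(X) - f_star))   # should stay bounded
```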

The connections between Lyapunov functions for some optimization algorithms and differential equations

JM Sanz Serna, KC Zygalakis - SIAM Journal on Numerical Analysis, 2021 - SIAM
In this manuscript we study the properties of a family of second-order differential equations
with damping, its discretizations, and their connections with accelerated optimization …
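
A minimal instance of the kind of connection studied, stated here only for orientation: for the damped second-order equation
\[
\ddot x(t) + \gamma\,\dot x(t) + \nabla f(x(t)) = 0, \qquad \gamma > 0,
\]
the energy \(\mathcal E(t) = f(x(t)) - f^\star + \tfrac12\|\dot x(t)\|^2\) satisfies \(\dot{\mathcal E}(t) = -\gamma\|\dot x(t)\|^2 \le 0\), and suitable discretizations of the ODE inherit discrete analogues of this Lyapunov function; the paper makes such correspondences precise for accelerated methods.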

Conformal symplectic and relativistic optimization

G França, J Sulam, D Robinson… - Advances in Neural …, 2020 - proceedings.neurips.cc
Arguably, the two most popular accelerated or momentum-based optimization methods are
Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different …
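
As a sketch of the discretization style this line of work builds on (the exact integrator and parameters below are illustrative assumptions, not necessarily the paper's scheme): heavy-ball dynamics x' = p, p' = -grad f(x) - gamma*p can be discretized by composing the exact flow of the damping term with a symplectic Euler step for the conservative part, which keeps the map conformal symplectic.

```python
import numpy as np

# Conformal-symplectic-style splitting for heavy-ball dynamics:
# exact damping flow, then a symplectic Euler step for the Hamiltonian part.
gamma, h = 1.0, 0.05
A = np.diag([1.0, 25.0]); b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x = np.array([4.0, -4.0]); p = np.zeros(2)
for _ in range(1000):
    p = np.exp(-gamma * h) * p          # exact flow of the damping term
    p = p - h * grad(x)                 # symplectic Euler: momentum update
    x = x + h * p                       # symplectic Euler: position update
print(np.linalg.norm(grad(x)))          # gradient norm near the minimizer
```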

Alternating mirror descent for constrained min-max games

A Wibisono, M Tao, G Piliouras - Advances in Neural …, 2022 - proceedings.neurips.cc
In this paper we study two-player bilinear zero-sum games with constrained strategy spaces.
A natural instance of such constraints arises when mixed strategies are used …
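
A small sketch of the alternating idea on probability simplices (the entropy mirror map gives multiplicative-weights updates; the payoff matrix and step size are arbitrary and this is not claimed to reproduce the paper's analysis): the second player updates against the first player's already-updated strategy rather than the previous one.

```python
import numpy as np

# Alternating mirror descent with the entropy mirror map (multiplicative
# weights) for the bilinear zero-sum game  min_x max_y  x^T A y  over simplices.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
eta, iters = 0.1, 5000
x = np.ones(5) / 5
y = np.ones(5) / 5
x_avg, y_avg = np.zeros(5), np.zeros(5)

for _ in range(iters):
    x = x * np.exp(-eta * (A @ y));  x /= x.sum()    # x-player: mirror descent step
    y = y * np.exp(eta * (A.T @ x)); y /= y.sum()    # y-player: responds to the new x
    x_avg += x; y_avg += y

x_avg /= iters; y_avg /= iters
# Duality gap of the averaged strategies (small near an equilibrium).
gap = np.max(A.T @ x_avg) - np.min(A @ y_avg)
print("duality gap:", gap)
```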