In a real Hilbert space H, in order to develop fast optimization methods, we analyze the asymptotic behavior, as time t tends to infinity, of a large class of autonomous dissipative …
B Bordelon, C Pehlevan - Advances in Neural Information …, 2022 - proceedings.neurips.cc
We analyze feature learning in infinite-width neural networks trained with gradient flow through a self-consistent dynamical field theory. We construct a collection of deterministic …
Accelerated optimization methods, such as Nesterov's accelerated gradient method, play a significant role in optimization. Several accelerated methods are provably optimal under …
Y Wei, Y Chen, X Zhao, J Cao - IEEE Transactions on Systems …, 2022 - ieeexplore.ieee.org
In this study, a framework for processing gradient algorithms is proposed in accordance with nabla fractional-order system theory. Unlike most of the literature, the gradient algorithm is …
J Kim, I Yang - International Conference on Machine …, 2023 - proceedings.mlr.press
Although Nesterov's accelerated gradient method (AGM) has been studied from various perspectives, it remains unclear why the most popular forms of AGMs must handle convex …
JJ Suh, G Roh, EK Ryu - International Conference on …, 2022 - proceedings.mlr.press
We analyze continuous-time models of accelerated gradient methods through deriving conservation laws in dilated coordinate systems. Namely, instead of analyzing the dynamics …
In this manuscript we study the properties of a family of second-order differential equations with damping, their discretizations, and their connections with accelerated optimization …
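For context on this line of work: a well-known representative of such damped second-order ODEs (due to Su, Boyd, and Candès, and not necessarily the exact family studied in the entry above) is

```latex
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
```

whose suitable discretization in the step size recovers Nesterov's accelerated gradient iterates, which is why continuous-time models of this form are used to analyze accelerated methods.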
Arguably, the two most popular accelerated or momentum-based optimization methods are Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different …
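For readers comparing the two methods mentioned in this entry, the textbook update rules differ only in where the gradient is evaluated: heavy ball takes the gradient at the current iterate, while Nesterov's method takes it at an extrapolated point. A minimal sketch on a toy ill-conditioned quadratic (the matrix `A`, step size `alpha`, and momentum `beta` below are illustrative choices, not values from any of the cited papers):

```python
import numpy as np

A = np.diag([1.0, 10.0])   # toy ill-conditioned quadratic f(x) = 0.5 * x^T A x
alpha, beta = 0.05, 0.9    # illustrative step size and momentum coefficient

def grad(x):
    # Gradient of f(x) = 0.5 * x^T A x
    return A @ x

# Polyak's heavy ball: x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
x_prev = x = np.array([5.0, 5.0])
for _ in range(500):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x

# Nesterov's accelerated gradient: gradient taken at the extrapolated point y_k
z_prev = z = np.array([5.0, 5.0])
for _ in range(500):
    y = z + beta * (z - z_prev)          # look-ahead (extrapolation) step
    z, z_prev = y - alpha * grad(y), z   # gradient step from the look-ahead point

print(np.linalg.norm(x), np.linalg.norm(z))  # both iterates approach the minimizer 0
```

Both loops converge to the minimizer here; the distinction between evaluating the gradient at `x_k` versus at `y_k` is precisely what makes the two methods correspond to different discretizations of the underlying continuous-time dynamics.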
In this paper we study two-player bilinear zero-sum games with constrained strategy spaces. An instance of natural occurrences of such constraints is when mixed strategies are used …