D Kovalev, A Gasnikov - Advances in Neural Information …, 2022 - proceedings.neurips.cc
In this paper, we study the fundamental open question of finding the optimal high-order algorithm for solving smooth convex minimization problems. Arjevani et al. (2019) …
A Sidford, C Zhang - Advances in Neural Information …, 2024 - proceedings.neurips.cc
We consider the problem of minimizing a continuous function given access to a natural quantum generalization of a stochastic gradient oracle. We provide two new …
Y Carmon, D Hausler, A Jambulapati… - Advances in Neural …, 2022 - proceedings.neurips.cc
We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any $ p\ge …
In this merged paper, we consider the problem of minimizing a convex function with Lipschitz-continuous $p$-th order derivatives. Given an oracle which, when queried at a …
Do you know the difference between an optimist and a pessimist? The former believes we live in the best possible world, and the latter is afraid that the former might be right.… In that …
We propose a near-optimal method for highly smooth convex optimization. More precisely, in the oracle model where one obtains the $p^{th}$ order Taylor expansion of a function at …
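The oracle model mentioned in this snippet can be made concrete for $p=2$: the oracle returns the function value, gradient, and Hessian, and one classical way to use it is the cubic-regularized Newton step of Nesterov and Polyak. The sketch below is illustrative only; it is not the near-optimal method from the snippet, and the test function and regularization constant are assumptions chosen for the demo.

```python
import numpy as np

def taylor_oracle(x):
    """Second-order (p=2) Taylor oracle for the toy convex function
    f(x) = sum_i x_i^4 (illustrative choice, not from the paper)."""
    f = np.sum(x ** 4)
    g = 4.0 * x ** 3
    H = np.diag(12.0 * x ** 2)
    return f, g, H

def cubic_step(g, H, M):
    """Minimize the cubic-regularized model
        m(s) = <g, s> + 0.5 s'Hs + (M/6)||s||^3.
    Optimality gives (H + (M/2)||s|| I) s = -g, so we bisect on
    r = ||s||; for PSD H the map r -> ||(H + (M/2) r I)^{-1} g||
    is nonincreasing, which makes bisection valid."""
    I = np.eye(len(g))

    def slen(r):
        return np.linalg.norm(np.linalg.solve(H + 0.5 * M * r * I, -g))

    hi = 1.0
    while slen(hi) > hi:          # grow the bracket until r >= ||s(r)||
        hi *= 2.0
    lo = 0.0
    for _ in range(100):          # bisect to locate the fixed point
        mid = 0.5 * (lo + hi)
        if slen(mid) > mid:
            lo = mid
        else:
            hi = mid
    r = 0.5 * (lo + hi)
    return np.linalg.solve(H + 0.5 * M * r * I, -g)

def cubic_newton(x0, M=24.0, iters=100):
    """Plain cubic-regularized Newton iteration driven by the oracle."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        _, g, H = taylor_oracle(x)
        x = x + cubic_step(g, H, M)
    return x

x_final = cubic_newton(np.ones(3))
```

Here `M` plays the role of (an upper bound on) the Lipschitz constant of the Hessian on the region traversed; the accelerated and near-optimal high-order methods in these papers wrap steps of this kind in extrapolation schemes.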
In this paper, we provide near-optimal accelerated first-order methods for minimizing a broad class of smooth nonconvex functions that are unimodal on all lines through a …
J Kim, I Yang - International Conference on Machine …, 2023 - proceedings.mlr.press
Although Nesterov's accelerated gradient method (AGM) has been studied from various perspectives, it remains unclear why the most popular forms of AGMs must handle convex …
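The AGM referenced in this snippet has a common two-sequence textbook form; the following is a minimal sketch assuming an $L$-smooth convex objective, with an illustrative quadratic as the test problem (the momentum schedule $k/(k+3)$ is one standard choice, equivalent to $(k-1)/(k+2)$ under an index shift).

```python
import numpy as np

def agm(grad, x0, L, iters):
    """Nesterov's accelerated gradient method, two-sequence form:
        y_k     = x_k + k/(k+3) * (x_k - x_{k-1})   # momentum step
        x_{k+1} = y_k - (1/L) * grad(y_k)           # gradient step
    For L-smooth convex f this attains the O(1/k^2) rate in
    function value."""
    x_prev = np.array(x0, dtype=float)
    x = x_prev.copy()
    for k in range(iters):
        y = x + (k / (k + 3)) * (x - x_prev)  # k=0 gives a plain gradient step
        x_prev, x = x, y - grad(y) / L
    return x

# Illustrative problem: f(x) = 0.5 x'Ax with condition number 100.
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
sol = agm(lambda x: A @ x, np.ones(2), L=100.0, iters=500)
```

The standard guarantee $f(x_k) - f^\star \le 2L\|x_0 - x^\star\|^2/(k+1)^2$ bounds the final objective here by roughly $1.6\times 10^{-3}$, versus $50.5$ at the starting point.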
R Jiang, A Mokhtari - Advances in Neural Information …, 2024 - proceedings.neurips.cc
In this paper, we propose an accelerated quasi-Newton proximal extragradient method for solving unconstrained smooth convex optimization problems. With access only to the …