Universality of AdaGrad Stepsizes for Stochastic Optimization: Inexact Oracle, Acceleration and Variance Reduction

A Rodomanov, X Jiang, S Stich - arXiv preprint arXiv:2406.06398, 2024 - arxiv.org
We present adaptive gradient methods (both basic and accelerated) for solving convex
composite optimization problems in which the main part is approximately smooth (a.k.a. …
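
To make the entry concrete, here is a minimal sketch of an AdaGrad-norm stepsize (a single scalar stepsize driven by cumulative squared gradient norms), which is the basic flavor of adaptive stepsize the paper builds on. The objective, its stochastic gradient, and the constants D and iters below are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

def stoch_grad(x):
    # Noisy gradient of f(x) = 0.5 * ||x||^2 (illustrative objective only).
    return x + 0.01 * rng.standard_normal(x.shape)

def adagrad_norm(x0, D=1.0, iters=1000):
    x, acc = x0.copy(), 0.0
    avg = np.zeros_like(x0)
    for _ in range(iters):
        g = stoch_grad(x)
        acc += np.dot(g, g)            # running sum of squared gradient norms
        step = D / np.sqrt(acc)        # AdaGrad-norm stepsize: no smoothness
        x -= step * g                  # or noise constants are needed up front
        avg += x
    return avg / iters                 # averaged iterate, standard for convex rates

print(np.linalg.norm(adagrad_norm(np.ones(5))))
```

The point of such stepsizes, and the sense of "universality" in the title, is that the same rule adapts to the (possibly unknown) smoothness and noise level of the problem without tuning.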

Universal methods for variational inequalities: Deterministic and stochastic cases

A Klimza, A Gasnikov, F Stonyakin, M Alkousa - Chaos, Solitons & Fractals, 2024 - Elsevier
In this paper, we propose universal proximal mirror methods to solve the variational
inequality problem with Hölder-continuous operators in both deterministic and stochastic …
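
For orientation, the workhorse behind proximal mirror methods for variational inequalities is the extragradient (mirror-prox) step: evaluate the operator, take a trial prox step, then correct using the operator at the trial point. The sketch below uses the Euclidean setup (so the prox step is a plain ball projection), a toy monotone operator F, and a simple decaying stepsize; all of these are illustrative assumptions and not the paper's exact universal method:

```python
import numpy as np

def F(z):
    # Rotation operator: monotone, with the VI solution at the origin (toy example).
    A = np.array([[0.0, 1.0], [-1.0, 0.0]])
    return A @ z

def project_ball(z, R=1.0):
    # Euclidean prox step onto the feasible set {z : ||z|| <= R}.
    n = np.linalg.norm(z)
    return z if n <= R else (R / n) * z

def extragradient(z0, iters=500):
    z = z0.copy()
    for k in range(iters):
        step = 1.0 / np.sqrt(k + 1)        # simple decaying stepsize (illustrative)
        w = project_ball(z - step * F(z))  # extrapolation (trial prox) step
        z = project_ball(z - step * F(w))  # correction step at the trial point
    return z

print(np.linalg.norm(extragradient(np.array([0.9, -0.5]))))
```

A universal method in the paper's sense would additionally adapt the stepsize to the unknown Hölder exponent and constant of the operator, rather than fixing a decay schedule in advance.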