Optimal decentralized distributed algorithms for stochastic convex optimization

E Gorbunov, D Dvinskikh, A Gasnikov - arXiv preprint arXiv:1911.07363, 2019 - arxiv.org
We consider stochastic convex optimization problems with affine constraints and develop
several methods, using either a primal or a dual approach, to solve them. In the primal case, we use …

Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems

D Dvinskikh, A Gasnikov - Journal of Inverse and Ill-posed Problems, 2021 - degruyter.com
We introduce primal and dual stochastic gradient oracle methods for decentralized convex
optimization problems. For both primal and dual oracles, the proposed methods are optimal …
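The entry above concerns gradient methods for decentralized convex optimization. As a point of reference, the classical decentralized gradient descent (DGD) scheme — not the paper's accelerated methods — can be sketched as follows; the ring topology, quadratic local objectives, and all parameter values are illustrative assumptions:

```python
import numpy as np

# Decentralized gradient descent (DGD) sketch over a ring of n agents.
# Each agent i holds a local quadratic f_i(x) = (x - b_i)^2; the global
# optimum is mean(b). Agents mix iterates with neighbors through a doubly
# stochastic matrix W, then take a local gradient step.
n, lr, steps = 5, 0.1, 300
rng = np.random.default_rng(0)
b = rng.standard_normal(n)             # local targets (assumed data)
W = np.zeros((n, n))
for i in range(n):                     # ring: weight self and two neighbors
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25
x = np.zeros(n)                        # x[i] is agent i's current iterate
for _ in range(steps):
    grad = 2 * (x - b)                 # local gradients, computed in parallel
    x = W @ x - lr * grad              # gossip averaging + local descent step
```

With a constant step size, plain DGD only reaches a neighborhood of the consensus optimum whose radius scales with the step; diminishing steps or gradient-tracking corrections remove that bias, and the accelerated primal/dual methods of the paper improve the rates further.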

Recent theoretical advances in decentralized distributed convex optimization

E Gorbunov, A Rogozin, A Beznosikov… - … and Probability: With a …, 2022 - Springer
In the last few years, the theory of decentralized distributed convex optimization has made
significant progress. The lower bounds on communication rounds and oracle calls have …

Gradient methods for problems with inexact model of the objective

FS Stonyakin, D Dvinskikh, P Dvurechensky… - … Optimization Theory and …, 2019 - Springer
We consider optimization methods for convex minimization problems under inexact
information on the objective function. We introduce an inexact model of the objective, which as …

Inexact model: A framework for optimization and variational inequalities

F Stonyakin, A Tyurin, A Gasnikov… - Optimization Methods …, 2021 - Taylor & Francis
In this paper, we propose a general algorithmic framework for first-order methods in
optimization in a broad sense, including minimization problems, saddle-point problems and …

First-order methods for convex optimization

P Dvurechensky, S Shtern, M Staudigl - EURO Journal on Computational …, 2021 - Elsevier
First-order methods for solving convex optimization problems have been at the forefront of
mathematical optimization in the last 20 years. The rapid development of this important class …

An accelerated directional derivative method for smooth stochastic convex optimization

P Dvurechensky, E Gorbunov, A Gasnikov - European Journal of …, 2021 - Elsevier
We consider smooth stochastic convex optimization problems in the context of algorithms
which are based on directional derivatives of the objective function. This context can be …
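Methods of the kind described above build on directional derivatives of the objective. A minimal sketch of the standard forward-difference estimator of a directional derivative (the toy function `f`, smoothing step `h`, and direction `e` are illustrative assumptions, not from the paper):

```python
import numpy as np

def directional_derivative(f, x, e, h=1e-6):
    """Forward-difference estimate of the derivative of f at x along
    direction e; accurate up to O(h) for smooth f."""
    return (f(x + h * e) - f(x)) / h

# Toy check on f(x) = ||x||^2, whose derivative along e is 2 <x, e>.
f = lambda x: float(np.dot(x, x))
x = np.array([1.0, 2.0])
e = np.array([1.0, 0.0])               # unit direction
est = directional_derivative(f, x, e)  # exact directional derivative is 2.0
```

Accelerated schemes of this type replace the full gradient in a momentum-based method by such one-dimensional estimates along randomly sampled directions.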

A stochastic subspace approach to gradient-free optimization in high dimensions

D Kozak, S Becker, A Doostan, L Tenorio - … Optimization and Applications, 2021 - Springer
We present a stochastic descent algorithm for unconstrained optimization that is particularly
efficient when the objective function is slow to evaluate and gradients are not easily …
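The idea of descending in random low-dimensional subspaces using only function values can be sketched as follows; this is an illustrative toy under assumed parameter names (`k`, `h`, `lr`), not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

def subspace_gf_step(f, x, k=2, h=1e-5, lr=0.1):
    """One gradient-free descent step in a random k-dimensional subspace:
    estimate the gradient along k random orthonormal directions by forward
    differences, then move against that estimate."""
    d = x.size
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))  # orthonormal directions
    fx = f(x)
    g = np.zeros(d)
    for i in range(k):
        e = Q[:, i]
        g += (f(x + h * e) - fx) / h * e  # projected gradient estimate
    return x - lr * g

# Toy run: minimize f(x) = ||x||^2 in dimension 10 with k = 2 probes/step.
f = lambda x: float(np.dot(x, x))
x = np.full(10, 1.0)
for _ in range(300):
    x = subspace_gf_step(f, x)
```

Each step needs only k + 1 function evaluations, which is the appeal when the objective is slow to evaluate and gradients are unavailable.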

Accelerated meta-algorithm for convex optimization problems

AV Gasnikov, DM Dvinskikh, PE Dvurechensky… - Computational …, 2021 - Springer
An envelope called an accelerated meta-algorithm is proposed. Based on the envelope,
accelerated methods for solving convex unconstrained minimization problems in various …

A stochastic derivative free optimization method with momentum

E Gorbunov, A Bibi, O Sener, EH Bergou… - arXiv preprint arXiv …, 2019 - arxiv.org
We consider the problem of unconstrained minimization of a smooth objective function in
$\mathbb{R}^d$ in the setting where only function evaluations are possible. We propose and …
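A generic way to combine function-evaluation-only search with momentum is a heavy-ball step driven by a one-direction finite difference; the sketch below illustrates that general idea, with all parameter names and values being assumptions rather than the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)

def dfo_momentum(f, x0, steps=600, h=1e-5, lr=0.2, beta=0.5):
    """Derivative-free heavy-ball sketch: each step takes a forward
    difference along one random unit direction as a cheap stochastic
    search direction and smooths it with a momentum buffer."""
    x = x0.astype(float).copy()
    v = np.zeros_like(x)
    for _ in range(steps):
        e = rng.standard_normal(x.size)
        e /= np.linalg.norm(e)               # random unit direction
        g = (f(x + h * e) - f(x)) / h * e    # derivative along e, mapped back
        v = beta * v + g                     # momentum accumulation
        x = x - lr * v
    return x

# Toy run on f(x) = ||x||^2 from the all-ones start in dimension 5.
f = lambda x: float(np.dot(x, x))
x = dfo_momentum(f, np.full(5, 1.0))
```

Only two function evaluations are used per iteration, so the per-step cost is independent of the problem dimension; the momentum buffer reuses information from past random directions.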