D Dvinskikh, A Gasnikov - Journal of Inverse and Ill-posed Problems, 2021 - degruyter.com
We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both primal and dual oracles, the proposed methods are optimal …
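To make the decentralized setting concrete, here is a minimal sketch of classical decentralized gradient descent (DGD) over a ring network — a toy baseline, not the paper's optimal oracle construction. Each node holds a private quadratic, and the network minimizes the average objective by interleaving gossip averaging with local gradient steps; the node objectives, ring topology, and step size are all illustrative assumptions.

```python
import numpy as np

def dgd(b, steps=2000, eta=0.01):
    """Toy decentralized gradient descent: node i holds f_i(x) = 0.5*(x - b_i)^2,
    and the network minimizes the average of the f_i without a coordinator."""
    n = len(b)
    # Symmetric, doubly stochastic mixing matrix for a ring topology.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    x = np.zeros(n)  # one scalar iterate per node
    for _ in range(steps):
        grad = x - b             # local gradients of the quadratics
        x = W @ x - eta * grad   # gossip averaging + local gradient step
    return x

b = np.array([1.0, 2.0, 3.0, 4.0])
x = dgd(b)
# With a constant step size, all nodes reach a small neighborhood of the
# global minimizer mean(b) = 2.5.
```

With a fixed step size, DGD only converges to an O(eta) neighborhood of consensus; the optimal methods the snippet refers to improve on such baselines in communication and oracle complexity.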
In the last few years, the theory of decentralized distributed convex optimization has made significant progress. The lower bounds on communications rounds and oracle calls have …
We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective, which as …
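The model idea can be illustrated with the simplest case: at each iterate, minimize a quadratic upper model of the objective (linearization plus an L-smoothness term), which recovers the plain gradient step. This is a hedged toy instance, not the paper's general inexact-model framework; the test objective and constants are assumptions.

```python
import numpy as np

def model_gradient_method(grad, x0, L, iters=200):
    """At iterate y, minimize the upper model
       psi(x; y) = f(y) + <grad f(y), x - y> + (L/2)||x - y||^2,
    whose exact minimizer is the gradient step x = y - grad f(y)/L."""
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        x = x - g / L   # argmin of the quadratic model psi(.; x)
    return x

# Toy smooth convex objective: f(x) = 0.5 x^T A x with A = diag(1, 4), so L = 4.
A = np.diag([1.0, 4.0])
grad = lambda x: A @ x
x = model_gradient_method(grad, np.array([5.0, -3.0]), L=4.0)
```

An inexact model in the sense of the snippet would allow psi to hold only up to a slack delta, which degrades the rate gracefully rather than breaking the method.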
F Stonyakin, A Tyurin, A Gasnikov… - Optimization Methods …, 2021 - Taylor & Francis
In this paper, we propose a general algorithmic framework for the first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and …
First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization in the last 20 years. The rapid development of this important class …
We consider smooth stochastic convex optimization problems in the context of algorithms which are based on directional derivatives of the objective function. This context can be …
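The key identity behind directional-derivative oracles can be checked numerically: for a uniform random unit direction e in R^d one has E[e e^T] = I/d, so d * f'(x; e) * e = d * <grad f(x), e> * e is an unbiased estimate of grad f(x). This is a generic fact the snippet's setting relies on, shown here with an assumed stand-in gradient vector, not the paper's specific algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
g = np.array([1.0, -2.0, 0.5])   # stand-in for grad f(x) at some point x

# Monte Carlo check of unbiasedness: average d * (g . e) * e over many
# uniform random unit directions e.
n = 200_000
E = rng.standard_normal((n, d))
E /= np.linalg.norm(E, axis=1, keepdims=True)   # rows uniform on the sphere
est = d * np.mean((E @ g)[:, None] * E, axis=0)
# est approaches g as n grows.
```

The d-fold variance inflation of this estimator is exactly why directional-derivative methods pay a dimension factor in their oracle complexity.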
We present a stochastic descent algorithm for unconstrained optimization that is particularly efficient when the objective function is slow to evaluate and gradients are not easily …
We propose an algorithmic envelope called an accelerated meta-algorithm. Based on this envelope, accelerated methods for solving convex unconstrained minimization problems in various …
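As a point of reference for what such an envelope produces, here is Nesterov's accelerated gradient method, the classic accelerated scheme for smooth convex minimization — a generic sketch of the acceleration pattern, not the meta-algorithm itself (which wraps inner solvers); the test problem is an assumption.

```python
import numpy as np

def nesterov_agd(grad, x0, L, iters=1000):
    """Accelerated gradient method with the FISTA momentum schedule:
    gradient step at a look-ahead point, then momentum extrapolation."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                          # step at look-ahead point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Ill-conditioned quadratic f(x) = 0.5 x^T A x with A = diag(1, 10), so L = 10.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x = nesterov_agd(grad, np.array([3.0, -2.0]), L=10.0)
```

The scheme attains the O(1/k^2) rate that is optimal for first-order methods on smooth convex problems, which is the benchmark the envelope is built to match across problem classes.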
We consider the problem of unconstrained minimization of a smooth objective function in $\mathbb{R}^d$ in a setting where only function evaluations are possible. We propose and …
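A generic instance of this function-evaluations-only setting is a two-point zeroth-order method: approximate the directional derivative along a random unit direction by a central finite difference of two function values, and use the dimension-scaled projection as a gradient estimate. This is an illustrative sketch under assumed constants, not the paper's proposed method.

```python
import numpy as np

rng = np.random.default_rng(1)

def zeroth_order_descent(f, x0, eta=0.05, tau=1e-4, iters=5000):
    """Two-point zeroth-order descent: per step, two function evaluations
    along a random unit direction replace the gradient."""
    x = x0.copy()
    d = len(x)
    for _ in range(iters):
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)                              # random unit direction
        fd = (f(x + tau * e) - f(x - tau * e)) / (2 * tau)  # ≈ f'(x; e)
        x = x - eta * d * fd * e                            # scaled estimate of grad f(x)
    return x

# Toy smooth objective: f(x) = 0.5 ||x - b||^2.
b = np.array([1.0, -2.0, 0.5])
f = lambda x: 0.5 * np.sum((x - b) ** 2)
x = zeroth_order_descent(f, np.zeros(3))
```

The smoothing radius tau trades finite-difference bias against numerical noise; for the quadratic test function the central difference is exact, so the iterates approach the minimizer b.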