Motivated by the recent surge of interest in algorithms for non-convex optimization, with applications to training deep neural networks and other optimization problems …
A Gasnikov, A Novitskii, V Novitskii… - arXiv preprint arXiv …, 2022 - arxiv.org
Gradient-free/zeroth-order methods for black-box convex optimization have been extensively studied over the last decade, with the main focus on oracle call complexity. In this …
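A minimal sketch of the standard two-point gradient estimator that such zeroth-order methods are typically built on; the objective f, the smoothing radius mu, and the step size below are illustrative assumptions, not the survey's specific scheme.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate from two oracle calls.

    Samples a uniform direction e on the unit sphere and returns
    d * (f(x + mu e) - f(x - mu e)) / (2 mu) * e, an unbiased estimate
    of the gradient of a smoothed version of f.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)                       # uniform point on the sphere
    return d * (f(x + mu * e) - f(x - mu * e)) / (2 * mu) * e

# Usage: gradient descent driven purely by function evaluations.
f = lambda x: 0.5 * np.dot(x, x)                 # toy smooth convex objective
x = np.ones(10)
for _ in range(500):
    x -= 0.1 * two_point_grad_estimate(f, x)
```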
We consider stochastic convex optimization problems with affine constraints and develop several methods using either a primal or a dual approach to solve them. In the primal case, we use …
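For intuition, a hedged sketch of the dual approach on a toy instance of min_x f(x) s.t. Ax = b: stochastic gradient ascent on the dual, where the Lagrangian minimizer happens to be available in closed form. The quadratic f, the matrices, and the step size are placeholder assumptions, not the methods developed in the paper.

```python
import numpy as np

# Toy instance of min_x 0.5*||x||^2 subject to Ax = b, for which
# argmin_x L(x, y) = argmin_x 0.5*||x||^2 + y^T (Ax - b) = -A^T y.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 8))
b = rng.standard_normal(3)

y = np.zeros(3)                          # dual variable for Ax = b
for _ in range(2000):
    x = -A.T @ y                         # primal minimizer of the Lagrangian
    noise = 0.01 * rng.standard_normal(3)
    y += 0.05 * (A @ x - b + noise)      # stochastic dual gradient ascent
print("constraint violation:", np.linalg.norm(A @ x - b))
```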
We consider distributed stochastic variational inequalities (VIs) on unbounded domains with heterogeneous (non-IID) problem data distributed across many devices. We …
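As background, a single-device sketch of the stochastic extragradient method commonly used for monotone VIs (find x* such that <F(x*), x - x*> >= 0 for all x); the bilinear operator, noise level, and step size are illustrative assumptions rather than the distributed algorithm of the paper.

```python
import numpy as np

# Monotone (skew-symmetric) operator of the bilinear saddle problem
# min_u max_v u^T B v:  F(u, v) = (B v, -B^T u), with solution at zero.
rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))

def F(x, sigma=0.01):
    u, v = x[:5], x[5:]
    g = np.concatenate([B @ v, -B.T @ u])
    return g + sigma * rng.standard_normal(10)   # stochastic oracle

x = rng.standard_normal(10)
eta = 0.05
for _ in range(3000):
    x_half = x - eta * F(x)      # extrapolation step
    x = x - eta * F(x_half)      # update using the extrapolated point
print("distance to the solution:", np.linalg.norm(x))
```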
D Dvinskikh, A Gasnikov - Journal of Inverse and Ill-posed Problems, 2021 - degruyter.com
We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both primal and dual oracles, the proposed methods are optimal …
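A rough sketch of the kind of decentralized stochastic gradient step such methods refine: each node mixes its iterate with its neighbors' via a gossip matrix W and takes a local stochastic gradient step. The ring topology, quadratic local objectives, and constant step size are assumptions made for illustration only.

```python
import numpy as np

n, d = 5, 4                              # number of nodes, dimension
rng = np.random.default_rng(2)
targets = rng.standard_normal((n, d))    # node i holds f_i(x) = 0.5*||x - t_i||^2

# Symmetric doubly stochastic gossip matrix for a ring topology.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3

X = np.zeros((n, d))                     # row i is node i's local iterate
for _ in range(300):
    grads = X - targets + 0.01 * rng.standard_normal((n, d))
    X = W @ X - 0.1 * grads              # gossip averaging + local SGD step

# Every node ends up near the global minimizer (the mean of the targets),
# up to the usual constant-step-size bias of decentralized SGD.
print(np.linalg.norm(X - targets.mean(axis=0), axis=1))
```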
In the last few years, the theory of decentralized distributed convex optimization has made significant progress. Lower bounds on communication rounds and oracle calls have …
Consider a convex optimization problem $\min_{x \in Q \subseteq \mathbb{R}^d} f(x)$ (1) with convex feasible set $Q$ and convex objective $f$ possessing a zeroth-order (gradient/derivative-free) oracle [83]. The …
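A minimal sketch of how a zeroth-order oracle plugs into problem (1): projected gradient descent with the gradient replaced by a finite-difference estimate. The Euclidean-ball feasible set Q, the smoothing radius, and the objective are placeholder assumptions.

```python
import numpy as np

def zo_grad(f, x, mu=1e-4):
    """Coordinate-wise forward-difference gradient estimate (d + 1 oracle calls)."""
    fx, g = f(x), np.empty(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = mu
        g[i] = (f(x + e) - fx) / mu
    return g

def proj_ball(x, r=1.0):
    """Euclidean projection onto Q = {x : ||x|| <= r}."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else r * x / nrm

f = lambda x: np.sum((x - 2.0) ** 2)     # toy smooth convex objective
x = np.zeros(5)
for _ in range(200):
    x = proj_ball(x - 0.1 * zo_grad(f, x))   # projected zeroth-order step
print(x)   # converges to the projection of the unconstrained minimizer onto Q
```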
In this paper, we study the standard formulation of an optimization problem in which computation of the gradient is not available. Such a problem can be classified as a “black box” …
In this paper we consider unconstrained minimization of a smooth function on $\mathbb{R}^n$ in a setting where only function evaluations are possible. We design a novel …
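In that spirit, a sketch of a classical Gaussian random-search scheme (in the style of Nesterov and Spokoiny's random gradient-free method, not necessarily the novel method the paper designs): step along a random Gaussian direction scaled by a forward-difference quotient.

```python
import numpy as np

def random_search_step(f, x, mu=1e-4, step=0.05, rng=np.random.default_rng(3)):
    """One Gaussian random-search iteration using two function evaluations."""
    u = rng.standard_normal(x.size)              # random Gaussian direction
    slope = (f(x + mu * u) - f(x)) / mu          # forward-difference quotient
    return x - step * slope * u                  # move against the estimate

# Usage on a smooth strongly convex test function in dimension 20.
f = lambda x: np.log(1.0 + np.sum(np.exp(x))) + 0.05 * np.dot(x, x)
x = np.ones(20)
for _ in range(2000):
    x = random_search_step(f, x)
print("final objective:", f(x))
```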