This article presents a powerful algorithmic framework for big data optimization, called the block successive upper-bound minimization (BSUM) method. BSUM includes as special cases …
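As a rough illustration of the BSUM idea, here is a minimal sketch: for a smooth f with block-Lipschitz gradients, one admissible upper bound per block is the standard quadratic surrogate, whose minimizer is a block gradient step. The least-squares objective and block sizes below are illustrative assumptions, not from the paper.

```python
import numpy as np

# Minimal BSUM sketch (assumed setting: smooth f, quadratic upper bound
# per block). Minimizing the surrogate over one block reduces to a block
# gradient step with the block Lipschitz constant.

def grad(x, A, b):
    return A.T @ (A @ x - b)          # gradient of f(x) = 0.5*||Ax - b||^2

def bsum(A, b, blocks, n_iters=200):
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        for idx in blocks:                              # cycle over blocks
            g = grad(x, A, b)[idx]
            L = np.linalg.norm(A[:, idx], 2) ** 2       # block Lipschitz constant
            x[idx] -= g / L    # exact minimizer of the quadratic upper bound
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
x = bsum(A, b, [np.arange(0, 5), np.arange(5, 10)])
print(0.5 * np.linalg.norm(A @ x - b) ** 2)   # close to the least-squares optimum
```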
This paper introduces a parallel and distributed algorithm for solving the following minimization problem with linear constraints: minimize f_1(x_1) + ⋯ + f_N(x_N) subject …
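A minimal sketch of the general idea behind such parallel schemes, using plain dual decomposition rather than the paper's own algorithm: for min Σ_i f_i(x_i) subject to Σ_i A_i x_i = b, each block subproblem is independent given the dual variable, so all blocks can be solved in parallel. The quadratic f_i below (with closed-form block solutions) is an assumption for illustration.

```python
import numpy as np

# Dual decomposition sketch for min sum_i f_i(x_i) s.t. sum_i A_i x_i = b.
# With the assumed f_i(x_i) = 0.5*||x_i - c_i||^2, each block solve is
# x_i = c_i - A_i^T y, computable independently (hence in parallel).

rng = np.random.default_rng(1)
N, m, n = 4, 5, 3
A = [rng.standard_normal((m, n)) for _ in range(N)]
c = [rng.standard_normal(n) for _ in range(N)]
b = rng.standard_normal(m)

y = np.zeros(m)       # dual variable for the coupling constraint
alpha = 0.02          # dual step size (assumed small enough)
for _ in range(5000):
    xs = [c[i] - A[i].T @ y for i in range(N)]      # independent block solves
    residual = sum(A[i] @ xs[i] for i in range(N)) - b
    y += alpha * residual                           # dual gradient ascent

print(np.linalg.norm(residual))   # constraint violation shrinks toward 0
```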
In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a …
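A hedged sketch of the parallel randomized coordinate descent pattern: sample a subset of coordinates each round and update them simultaneously with a suitably damped step. The conservative factor tau in the step size below stands in for the separability-based constant used in the analysis; the objective is an assumed least-squares example.

```python
import numpy as np

# Parallel randomized coordinate descent sketch: tau coordinates sampled
# uniformly per round, updated simultaneously with step 1/(tau * L_i).

def pcdm(A, b, tau=4, n_rounds=3000, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = (A ** 2).sum(axis=0)       # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    for _ in range(n_rounds):
        S = rng.choice(n, size=tau, replace=False)  # random coordinate set
        g = A.T @ (A @ x - b)      # full gradient, for clarity only
        x[S] -= g[S] / (tau * L[S])   # simultaneous "parallel" updates
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 16))
b = rng.standard_normal(40)
x = pcdm(A, b)
print(np.linalg.norm(A.T @ (A @ x - b)))   # gradient norm, near 0
```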
We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves …
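To show the access pattern such asynchronous methods rely on, here is an illustrative sketch (not the paper's algorithm): several threads update a shared iterate lock-free, so each may read slightly stale values. CPython's GIL prevents true parallelism, so this only demonstrates the mechanics.

```python
import threading
import numpy as np

# Asynchronous stochastic coordinate descent sketch: workers pick random
# coordinates and update a shared iterate without locks.

rng0 = np.random.default_rng(3)
A = rng0.standard_normal((50, 20))
b = rng0.standard_normal(50)
L = (A ** 2).sum(axis=0)     # coordinate Lipschitz constants
x = np.zeros(20)             # shared iterate, updated lock-free

def worker(seed, n_steps=2000):
    rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        i = rng.integers(20)                 # random coordinate
        g_i = A[:, i] @ (A @ x - b)          # gradient from possibly stale x
        x[i] -= g_i / L[i]                   # scalar in-place update

threads = [threading.Thread(target=worker, args=(s,)) for s in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(np.linalg.norm(A.T @ (A @ x - b)))     # should be small
```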
Finding a fixed point of a nonexpansive operator, i.e., x^* = Tx^*, abstracts many problems in numerical linear algebra, optimization, and other areas of data science. To solve fixed-point …
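A classical way to solve x^* = Tx^* is the Krasnosel'skii-Mann iteration, which averages the current point with its image under T. Below is a minimal sketch; the particular T (a gradient step x - gamma*grad f(x), nonexpansive for gamma <= 2/L, whose fixed points are minimizers of f) is an assumed example.

```python
import numpy as np

# Krasnosel'skii-Mann iteration: x <- (1 - alpha)*x + alpha*T(x)
# for a nonexpansive operator T.

rng = np.random.default_rng(4)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
gamma = 1.9 / L

def T(x):                          # nonexpansive for gamma <= 2/L
    return x - gamma * (A.T @ (A @ x - b))

x = np.zeros(10)
alpha = 0.5                        # averaging weight in (0, 1)
for _ in range(2000):
    x = (1 - alpha) * x + alpha * T(x)

print(np.linalg.norm(x - T(x)))    # fixed-point residual, near 0
```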
The block coordinate descent (BCD) method is widely used for minimizing a continuous function f of several block variables. At each iteration of this method, a single block of …
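A minimal BCD sketch matching that description: cycle through blocks, minimizing f exactly over one block while the others stay fixed. For the assumed least-squares f, each block update is itself a small least-squares solve.

```python
import numpy as np

# Cyclic BCD with exact per-block minimization on f(x) = 0.5*||Ax - b||^2.

rng = np.random.default_rng(5)
A = rng.standard_normal((40, 12))
b = rng.standard_normal(40)
blocks = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
x = np.zeros(12)

for _ in range(50):
    for I in blocks:
        rest = np.setdiff1d(np.arange(12), I)
        r = b - A[:, rest] @ x[rest]          # residual with block I removed
        x[I] = np.linalg.lstsq(A[:, I], r, rcond=None)[0]  # exact block solve

print(0.5 * np.linalg.norm(A @ x - b) ** 2)   # objective after BCD sweeps
```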
We describe an asynchronous parallel stochastic proximal coordinate descent algorithm for minimizing a composite objective function, which consists of a smooth convex function …
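The core update in such methods is a proximal coordinate step. Here is a sketch of that step on an assumed composite objective 0.5*||Ax - b||^2 + lam*||x||_1, where the prox of the l1 term is soft-thresholding; an asynchronous variant would run these updates from multiple workers.

```python
import numpy as np

# Randomized proximal coordinate descent: coordinate gradient step,
# then soft-thresholding (the prox of the l1 term).

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(6)
A = rng.standard_normal((60, 25))
b = rng.standard_normal(60)
lam = 0.1
L = (A ** 2).sum(axis=0)          # coordinate Lipschitz constants
x = np.zeros(25)

for _ in range(20000):
    i = rng.integers(25)                       # random coordinate
    g_i = A[:, i] @ (A @ x - b)                # coordinate gradient
    x[i] = soft_threshold(x[i] - g_i / L[i], lam / L[i])

print(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```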
We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a (block) separable nonsmooth, convex …
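A loose sketch of the decomposition idea for smooth-plus-separable objectives (not the framework's exact subproblems): each block solves a strongly convex approximation of the objective in parallel (here, linearize the smooth part and keep the l1 term), and the iterate then moves a damped step gamma toward the collected block solutions. All constants below are illustrative assumptions.

```python
import numpy as np

# Parallel convexified block subproblems plus a damped step, on the
# assumed objective 0.5*||Ax - b||^2 + lam*||x||_1.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(7)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
lam, tau, gamma = 0.1, 1.0, 0.5   # prox weight tau, step size gamma (assumed)
L = np.linalg.norm(A, 2) ** 2
x = np.zeros(20)

for _ in range(500):
    g = A.T @ (A @ x - b)          # gradient of the smooth part
    # Each block's convexified subproblem is separable and has a
    # closed-form prox solution, so all blocks solve independently.
    x_hat = soft_threshold(x - g / (tau * L), lam / (tau * L))
    x = x + gamma * (x_hat - x)    # damped step toward the block solutions

print(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```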
Y. Xu and W. Yin, SIAM Journal on Optimization, 2015.
The stochastic gradient (SG) method can quickly solve a problem with a large number of components in the objective, or a stochastic optimization problem, to a moderate accuracy …
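A sketch in the spirit of combining stochastic gradients with block updates: sample a minibatch to estimate the gradient, then update a single block with a diminishing step. The step-size schedule, batch size, and block schedule below are illustrative assumptions.

```python
import numpy as np

# Block stochastic gradient sketch on f(x) = (1/2M)*||Ax - b||^2:
# minibatch gradient estimate, one block updated per iteration.

rng = np.random.default_rng(8)
M, n = 500, 20
A = rng.standard_normal((M, n))
b = rng.standard_normal(M)
blocks = [np.arange(0, 10), np.arange(10, 20)]
x = np.zeros(n)

for k in range(1, 5001):
    batch = rng.choice(M, size=32, replace=False)          # stochastic minibatch
    g = A[batch].T @ (A[batch] @ x - b[batch]) / 32        # minibatch gradient
    I = blocks[k % 2]                                      # alternate blocks
    x[I] -= (0.2 / np.sqrt(k)) * g[I]                      # diminishing step

print(0.5 * np.linalg.norm(A @ x - b) ** 2 / M)
```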