Convex optimization is at the core of many of today's analysis tools for large datasets, and of machine learning methods in particular. In this thesis we will study the general setting of …
We study the linear convergence of variants of the Frank-Wolfe algorithm for some classes of strongly convex problems, using only affine-invariant quantities. As in Guélat & Marcotte …
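For context, one standard affine-invariant quantity used in such analyses is a curvature constant of \(f\) over the feasible set; a common form (stated here as an illustrative sketch, with \(\mathcal{D}\) denoting the domain, notation chosen for this note rather than taken from the snippet above) is

\[
C_f \;:=\; \sup_{\substack{x,\,s \,\in\, \mathcal{D},\;\gamma \,\in\, [0,1] \\ y \,=\, x + \gamma (s - x)}} \; \frac{2}{\gamma^{2}} \Bigl( f(y) - f(x) - \langle y - x,\, \nabla f(x) \rangle \Bigr),
\]

which is unchanged under affine reparametrizations of the domain, unlike Lipschitz constants tied to a particular choice of norm.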
J. Peña, D. Rodríguez - Mathematics of Operations Research, 2019 - pubsonline.informs.org
It is well known that the gradient descent algorithm converges linearly when applied to a strongly convex function with Lipschitz gradient. In this case, the algorithm's rate of …
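As a concrete instance of this well-known claim: for a \(\mu\)-strongly convex \(f\) with \(L\)-Lipschitz gradient, gradient descent with step size \(1/L\) satisfies the standard bound

\[
f(x_{k+1}) - f(x^{\star}) \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)\bigl(f(x_{k}) - f(x^{\star})\bigr),
\qquad\text{hence}\qquad
f(x_{k}) - f(x^{\star}) \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)^{k}\bigl(f(x_{0}) - f(x^{\star})\bigr),
\]

i.e. linear (geometric) convergence at a rate governed by the condition number \(L/\mu\).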
Recently, the machine learning community has shown renewed interest in variants of a sparse greedy approximation procedure for concave optimization known as the Frank …
MA Bashiri, X Zhang - Advances in neural information …, 2017 - proceedings.neurips.cc
Frank-Wolfe (FW) algorithms with linear convergence rates have recently achieved great efficiency in many applications. Garber and Meshi (2016) designed a new decomposition …
We propose a generalized variant of the Frank-Wolfe algorithm for solving a class of sparse/low-rank optimization problems. Our formulation includes Elastic Net, regularized SVMs and …
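To make the link between Frank-Wolfe steps and sparsity concrete, here is a minimal sketch (an illustration under generic assumptions, not the paper's actual formulation) of the linear minimization oracle over an \(\ell_1\) ball of radius tau: its minimizers are signed coordinate vertices, so each Frank-Wolfe step changes at most one coordinate and the iterates stay sparse.

```python
# Illustrative sketch (not the paper's formulation): the linear minimization
# oracle over the l1 ball {x : ||x||_1 <= tau}. Its minimizers are the signed
# vertices +/- tau * e_i, so each Frank-Wolfe step touches a single coordinate.
import numpy as np

def lmo_l1_ball(grad, tau=1.0):
    """argmin_{||s||_1 <= tau} <grad, s>: pick the coordinate with the largest
    absolute gradient entry and move to the opposite-signed vertex."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -tau * np.sign(grad[i])
    return s
```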
Frank-Wolfe algorithms, also known as conditional gradient algorithms, solve constrained optimization problems. They break a non-linear problem down into a series of linear …
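A minimal sketch of that idea, assuming a quadratic objective and the probability simplex as the feasible set (both choices are illustrative, not taken from any of the works above): at each iteration the objective is linearized at the current point, the linear subproblem is solved over the feasible set, and the iterate moves toward the resulting vertex by a convex combination.

```python
# Minimal sketch of the classic Frank-Wolfe (conditional gradient) iteration,
# here minimizing f(x) = 0.5 * ||A x - b||^2 over the probability simplex.
# The problem data (A, b) and the 2/(k+2) step-size rule are illustrative.
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin_{s in simplex} <grad, s> is the vertex e_i with i = argmin_i grad_i."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(A, b, n_iters=200):
    n = A.shape[1]
    x = np.ones(n) / n                      # start at the simplex barycenter
    for k in range(n_iters):
        grad = A.T @ (A @ x - b)            # gradient of the quadratic objective
        s = lmo_simplex(grad)               # solve the linearized subproblem
        gamma = 2.0 / (k + 2.0)             # standard open-loop step size
        x = (1 - gamma) * x + gamma * s     # convex combination keeps x feasible
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)
    x = frank_wolfe(A, b)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Note that the only access to the feasible set is through the linear oracle, which is why no projection step is needed.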
Large-scale optimization problems abound in data mining and machine learning applications, and the computational challenges they pose are often addressed through …
Frank-Wolfe algorithms for convex minimization have recently gained considerable attention from the Optimization and Machine Learning communities, as their properties make them a …