Supplement to “Pathwise coordinate optimization for sparse learning: Algorithm and theory”. The supplementary materials contain proofs of the theoretical lemmas in …
Convex optimization is at the core of many of today's analysis tools for large datasets, and in particular machine learning methods. In this thesis we will study the general setting of …
There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov [SIAM J. Optim., 22 (2) …
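Randomized coordinate descent of the kind this line of work studies can be illustrated with a minimal sketch: at each step, sample one coordinate uniformly at random and minimize exactly along it. The function name, the quadratic least-squares objective, and all parameters below are illustrative assumptions, not taken from any of the cited papers.

```python
import numpy as np

def randomized_coordinate_descent(A, b, n_iters=5000, seed=0):
    """Minimize f(x) = 0.5 * ||Ax - b||^2 by updating one randomly
    chosen coordinate per iteration with an exact line search."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)   # per-coordinate curvature ||A_j||^2
    r = A @ x - b                   # residual, maintained incrementally
    for _ in range(n_iters):
        j = rng.integers(n)         # sample a coordinate uniformly
        g = A[:, j] @ r             # partial derivative df/dx_j
        step = g / col_sq[j]        # exact minimizer along coordinate j
        x[j] -= step
        r -= step * A[:, j]         # cheap residual update, no full A @ x
    return x
```

Maintaining the residual makes each update cost O(m) rather than O(mn), which is the usual reason coordinate methods scale to large problems.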
We characterize the effectiveness of a classical algorithm for recovering the Markov graph of a general discrete pairwise graphical model from i.i.d. samples. The algorithm is …
We study the question of learning a sparse multivariate polynomial over the real domain. In particular, for an unknown polynomial f(x) of degree d with k monomials, we show how to …
T Zhang - Advances in neural information processing …, 2008 - proceedings.neurips.cc
We study learning formulations with non-convex regularization that are natural for sparse linear models. There are two approaches to this problem: (1) heuristic methods such as …
Linear regression in Lp-norm is a canonical optimization problem that arises in several applications, including sparse recovery, semi-supervised learning, and signal processing …
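A standard way to approach Lp-norm regression is iteratively reweighted least squares (IRLS): each step solves a weighted L2 problem whose weights come from the current residual. The sketch below is an illustrative assumption of that generic scheme, not the algorithm of the snippet above; the function name and the eps clamp (needed when p < 2 and residuals vanish) are mine.

```python
import numpy as np

def lp_regression_irls(A, b, p=1.5, n_iters=50, eps=1e-8):
    """Approximately minimize ||Ax - b||_p via IRLS: repeatedly solve
    the weighted least-squares problem with weights |r_i|^(p-2)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]      # start from the L2 fit
    for _ in range(n_iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)  # clamp avoids division by 0
        Aw = A * w[:, None]                        # rows of A scaled by weights
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)    # normal equations A^T W A x = A^T W b
    return x
```

For 1 < p < 2 each weighted solve majorizes the Lp objective, so the objective decreases monotonically from the L2 starting point.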
T Zhang - Advances in neural information processing …, 2008 - proceedings.neurips.cc
Consider linear prediction models where the target function is a sparse linear combination of a set of basis functions. We are interested in the problem of identifying those basis functions …
We study randomized sketching methods for approximately solving a least-squares problem with a general convex constraint. The quality of a least-squares approximation can be …
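The classical "sketch-and-solve" baseline for unconstrained least squares replaces the tall problem with a much smaller randomly projected one. The following is a minimal sketch under my own assumptions (Gaussian sketch, unconstrained problem, illustrative function name); it is not the constrained method the snippet refers to.

```python
import numpy as np

def sketched_least_squares(A, b, sketch_size, seed=0):
    """Approximate argmin_x ||Ax - b||_2 by solving the smaller problem
    argmin_x ||S A x - S b||_2, where S is a Gaussian sketch with
    sketch_size rows (classical sketch-and-solve)."""
    rng = np.random.default_rng(seed)
    m = A.shape[0]
    S = rng.normal(size=(sketch_size, m)) / np.sqrt(sketch_size)
    return np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]
```

With a sketch of size roughly d / eps^2 for an m x d matrix, the sketched solution's residual is within a (1 + eps) factor of the optimum with high probability, while the solve itself only touches a sketch_size x d problem.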