During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine …
The idea for this book came from the time the authors spent at the Statistics and Applied Mathematical Sciences Institute (SAMSI) in Research Triangle Park in North Carolina …
We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the …
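The Dantzig selector mentioned in this snippet is defined as an ℓ1-minimization problem under a constraint on the correlation of the residual with the design, which can be posed as a linear program. Below is a minimal sketch using SciPy's `linprog`; the synthetic data, the penalty level `lam`, and the `dantzig_selector` helper are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Dantzig selector as a linear program:
        min ||b||_1  s.t.  ||X^T (y - X b) / n||_inf <= lam
    Splitting b = u - v with u, v >= 0 linearizes the objective."""
    n, p = X.shape
    A = X.T @ X / n
    c0 = X.T @ y / n
    # Constraints: [A, -A] z <= c0 + lam  and  [-A, A] z <= lam - c0,
    # where z = [u; v] stacks the positive and negative parts of b.
    A_ub = np.vstack([np.hstack([A, -A]), np.hstack([-A, A])])
    b_ub = np.concatenate([c0 + lam, lam - c0])
    res = linprog(np.ones(2 * p), A_ub=A_ub, b_ub=b_ub,
                  bounds=(0, None), method="highs")
    return res.x[:p] - res.x[p:]

# Synthetic sparse problem: 3 nonzero coefficients out of 30.
rng = np.random.default_rng(0)
n, p, k = 50, 30, 3
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.0
y = X @ beta + 0.1 * rng.standard_normal(n)
b_hat = dantzig_selector(X, y, lam=0.1)
```

With a well-conditioned random design and strong signal, the estimate concentrates on the true support, mirroring the sparsity-pattern behavior the snippet compares against the Lasso.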
Finding sparse approximate solutions to large underdetermined linear systems of equations is a common problem in signal/image processing and statistics. Basis pursuit, the least …
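The ℓ1-penalized least-squares problem behind basis pursuit and the Lasso is commonly solved by cyclic coordinate descent with soft-thresholding. A minimal self-contained NumPy sketch (the data, the penalty `lam`, and the `lasso_cd` helper are illustrative assumptions, not any cited paper's algorithm):

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=200):
    """Lasso via cyclic coordinate descent:
        min_b (1/2n) * ||y - X b||^2 + lam * ||b||_1
    Each coordinate update is a soft-thresholding step."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y.copy()                        # residual y - X b
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]         # partial residual without feature j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

# Underdetermined system: 100 equations, 200 unknowns, 5 nonzeros.
rng = np.random.default_rng(0)
n, p, k = 100, 200, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.0
y = X @ beta + 0.1 * rng.standard_normal(n)
b_hat = lasso_cd(X, y, lam=0.1)
```

Even though the system is underdetermined (p > n), the ℓ1 penalty drives most coefficients exactly to zero, which is the sparse-approximation behavior this snippet describes.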
H Zou, HH Zhang - Annals of statistics, 2009 - ncbi.nlm.nih.gov
We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method …
MJ Wainwright - IEEE transactions on information theory, 2009 - ieeexplore.ieee.org
The problem of consistently estimating the sparsity pattern of a vector β* ∈ ℝ^p based on observations contaminated by noise arises in various contexts, including signal …
Oracle inequalities and variable selection properties for the Lasso in linear models have been established under a variety of different assumptions on the design matrix. We show in …
The Lasso is an attractive technique for regularization and variable selection for high-dimensional data, where the number of predictor variables p_n is potentially much larger than …
J Fan, R Samworth, Y Wu - The Journal of Machine Learning Research, 2009 - jmlr.org
Variable selection in high-dimensional space characterizes many contemporary problems in scientific discovery and decision making. Many frequently used techniques are based on …