D Vidaurre, C Bielza… - International Statistical …, 2013 - Wiley Online Library
L1 regularization, or regularization with an L1 penalty, is a popular idea in statistics and machine learning. This paper reviews the concept and application of L1 regularization for …
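As a concrete illustration of the idea in this snippet (not code from the paper), here is a minimal lasso solver via proximal gradient descent (ISTA) in numpy; the problem sizes, penalty level, and iteration count are illustrative assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the L1 norm: shrink each entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient (ISTA)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = soft_threshold(b - grad / L, lam / L)
    return b

# Tiny demo: noiseless design with a 3-sparse ground truth
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
beta = np.zeros(20)
beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta
b_hat = lasso_ista(X, y, lam=0.1)
```

The soft-thresholding step is what produces exact zeros in the estimate, which is the sparsity-inducing property the review surveys.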
This first chapter formulates the objectives of compressive sensing. It introduces the standard compressive problem studied throughout the book and reveals its ubiquity in many …
Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors. The Annals of Statistics, 2013, Vol. 41, No. 6, 2786–2819. DOI …
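The multiplier-bootstrap idea in this entry can be sketched in a few lines of numpy; this is an illustrative toy (statistic, dimensions, and number of draws are all assumptions, not the paper's procedure):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, B = 200, 50, 1000

# n iid observations in R^p; the statistic is the maximum coordinate of the
# normalized sum, T = max_j |n^{-1/2} sum_i X_ij|.
X = rng.standard_normal((n, p))
T = np.max(np.abs(X.sum(axis=0)) / np.sqrt(n))

# Gaussian multiplier bootstrap: reweight the centered summands with iid
# N(0,1) multipliers e_i and recompute the max statistic B times.
Xc = X - X.mean(axis=0)
e = rng.standard_normal((B, n))
Tb = np.max(np.abs(e @ Xc) / np.sqrt(n), axis=1)
crit = np.quantile(Tb, 0.95)  # bootstrap 95% critical value for T
```

The point of the construction is that the bootstrap distribution of `Tb` approximates that of the maximum even when p is comparable to or larger than n.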
The compressive sensing (CS) framework aims to ease the burden on analog-to-digital converters (ADCs) by reducing the sampling rate required to acquire and stably recover …
Supplementary material for "Least squares after model selection in high-dimensional sparse models". The online supplemental article contains finite-sample results for the estimation …
Y Plan, R Vershynin - Communications on Pure and Applied …, 2013 - Wiley Online Library
We give the first computationally tractable and almost optimal solution to the problem of one-bit compressed sensing, showing how to accurately recover an s-sparse vector …
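To make the measurement model concrete: in one-bit compressed sensing only the signs of random linear measurements are observed. The sketch below uses a crude back-projection estimator for illustration; it is not the paper's convex program, and all dimensions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, s = 2000, 100, 5

# Unit-norm s-sparse signal (one-bit measurements lose the norm, so we fix it)
x = np.zeros(n)
x[:s] = rng.standard_normal(s)
x /= np.linalg.norm(x)

A = rng.standard_normal((m, n))
y = np.sign(A @ x)                 # one-bit measurements: only the signs

# Crude estimator: back-project the signs (E[A^T y / m] is proportional to x
# for Gaussian A), keep the s largest entries, and renormalize.
z = A.T @ y / m
support = np.argsort(np.abs(z))[-s:]
x_hat = np.zeros(n)
x_hat[support] = z[support]
x_hat /= np.linalg.norm(x_hat)
```

Even this naive estimator aligns well with the true direction; the paper's contribution is a tractable convex program with near-optimal guarantees.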
After a decade of extensive study of the sparse representation synthesis model, we can safely say that this is a mature and stable field, with clear theoretical foundations, and …
TT Cai, A Zhang - IEEE Transactions on Information Theory, 2013 - ieeexplore.ieee.org
This paper considers compressed sensing and affine rank minimization in both noiseless and noisy cases and establishes sharp restricted isometry conditions for sparse signal and …
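The sparse-recovery problem these restricted isometry conditions govern can be illustrated with a short orthogonal matching pursuit (OMP) demo; the greedy solver and all dimensions are illustrative choices, not taken from the paper:

```python
import numpy as np

def omp(A, y, s):
    """Orthogonal matching pursuit: greedily pick the column most correlated
    with the residual, then refit by least squares, s times."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(s):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# Noiseless demo: 50 Gaussian measurements of a 4-sparse vector in R^64
rng = np.random.default_rng(3)
m, n, s = 50, 64, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)   # column-normalized sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s) + 2.0
y = A @ x_true
x_hat = omp(A, y, s)
```

Random Gaussian matrices of this shape satisfy restricted isometry conditions with high probability, which is what makes exact recovery from m < n measurements possible here.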
We propose a method for constructing p-values for general hypotheses in a high-dimensional linear model. The hypotheses can be local for testing a single regression …