[BOOK][B] An invitation to compressive sensing

S Foucart, H Rauhut - 2013 - Springer
This first chapter formulates the objectives of compressive sensing. It introduces the
standard compressive problem studied throughout the book and reveals its ubiquity in many …
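For context, the standard compressive problem mentioned here is usually stated as follows (generic notation, not quoted from the chapter): recover a vector $x \in \mathbb{R}^N$ with at most $s$ nonzero entries from the measurements $y = Ax \in \mathbb{R}^m$, where $A$ is a known $m \times N$ matrix and $m \ll N$.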

Scalable Bayes via barycenter in Wasserstein space

S Srivastava, C Li, DB Dunson - Journal of Machine Learning Research, 2018 - jmlr.org
Divide-and-conquer based methods for Bayesian inference provide a general approach for
tractable posterior inference when the sample size is large. These methods divide the data …
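As a rough sketch of the construction (standard notation, not quoted from the paper): if $\Pi_1, \dots, \Pi_K$ are the posteriors computed on $K$ data subsets, the combined posterior is their barycenter in the 2-Wasserstein metric, $\bar{\Pi} = \arg\min_{\Pi} \sum_{k=1}^{K} W_2^2(\Pi, \Pi_k)$, so the subset posteriors can be computed in parallel and only the barycenter step couples them.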

Moving beyond sub-Gaussianity in high-dimensional statistics: Applications in covariance estimation and linear regression

AK Kuchibhotla, A Chakrabortty - … and Inference: A Journal of the …, 2022 - academic.oup.com
Concentration inequalities form an essential toolkit in the study of high-dimensional
statistical methods. Most of the relevant statistics literature in this regard is, however, based …
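One standard way to formalize tails heavier than sub-Gaussian (generic notation, not taken from the abstract) is via the Orlicz norms $\|X\|_{\psi_\alpha} = \inf\{t > 0 : \mathbb{E} \exp(|X|^\alpha / t^\alpha) \le 2\}$: $\alpha = 2$ gives sub-Gaussian variables, $\alpha = 1$ sub-exponential ones, and $\alpha < 1$ still heavier, sub-Weibull-type tails.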

Finite time LTI system identification

T Sarkar, A Rakhlin, MA Dahleh - Journal of Machine Learning Research, 2021 - jmlr.org
We address the problem of learning the parameters of a stable linear time invariant (LTI)
system with unknown latent space dimension, or order, from a single time series of noisy …
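The object being identified is the usual state-space model (generic notation, not quoted from the paper) $x_{t+1} = A x_t + B u_t + \eta_t$, $y_t = C x_t + D u_t + w_t$, with the order, i.e. the dimension of the latent state $x_t$, unknown, and with all observations coming from a single trajectory $\{(u_t, y_t)\}_{t=1}^{T}$.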

Reinforcement learning trees

R Zhu, D Zeng, MR Kosorok - Journal of the American Statistical …, 2015 - Taylor & Francis
In this article, we introduce a new type of tree-based method, reinforcement learning trees
(RLT), which exhibits significantly improved performance over traditional methods such as …

Tail bounds via generic chaining

S Dirksen - 2015 - projecteuclid.org
We modify Talagrand's generic chaining method to obtain upper bounds for all p-th
moments of the supremum of a stochastic process. These bounds lead to an estimate for the …
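For reference, the classical generic chaining bound controls a process with sub-Gaussian increments via Talagrand's functional $\gamma_2(T, d) = \inf \sup_{t \in T} \sum_{n \ge 0} 2^{n/2} d(t, T_n)$, where the infimum runs over admissible sequences of subsets $T_n \subset T$ with $|T_0| = 1$ and $|T_n| \le 2^{2^n}$, yielding $\mathbb{E} \sup_t X_t \le C \gamma_2(T, d)$; the moment bounds referred to here extend this to all $p$-th moments (standard notation, not quoted from the paper).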

How correlations influence lasso prediction

M Hebiri, J Lederer - IEEE Transactions on Information Theory, 2012 - ieeexplore.ieee.org
We study how correlations in the design matrix influence Lasso prediction. First, we argue
that the higher the correlations, the smaller the optimal tuning parameter. This implies in …
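The tuning parameter in question is the $\lambda$ in the usual Lasso criterion (standard formulation, not quoted from the paper) $\hat{\beta}_\lambda = \arg\min_{\beta} \, n^{-1} \|y - X\beta\|_2^2 + \lambda \|\beta\|_1$, so the claim is that stronger correlation among the columns of the design matrix $X$ allows a smaller $\lambda$ for good prediction.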

Cross-validation with confidence

J Lei - Journal of the American Statistical Association, 2020 - Taylor & Francis
Cross-validation is one of the most popular model and tuning parameter selection methods
in statistics and machine learning. Despite its wide applicability, traditional cross-validation …
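The baseline here is ordinary $K$-fold cross-validation (standard notation, not quoted from the paper): with folds $I_1, \dots, I_K$ and $\hat{f}_m^{(-k)}$ denoting candidate model $m$ fit without fold $k$, one computes $\widehat{R}_{\mathrm{CV}}(m) = K^{-1} \sum_{k=1}^{K} |I_k|^{-1} \sum_{i \in I_k} \ell(y_i, \hat{f}_m^{(-k)}(x_i))$ and selects the $m$ with the smallest value.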

The group square-root lasso: Theoretical properties and fast algorithms

F Bunea, J Lederer, Y She - IEEE Transactions on Information …, 2013 - ieeexplore.ieee.org
We introduce and study the group square-root lasso (GSRL) method for estimation in high
dimensional sparse regression models with group structure. The new estimator minimizes …
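A criterion of the kind described, reconstructed from the square-root lasso literature rather than quoted from the truncated abstract, is $\hat{\beta} = \arg\min_{\beta} \, n^{-1/2} \|y - X\beta\|_2 + \lambda \sum_{j} \|\beta_{G_j}\|_2$ (possibly with group-size weights on the penalty), where the $G_j$ partition the coefficients into groups; the square-root loss makes the choice of $\lambda$ insensitive to the noise level, while the group penalty induces groupwise sparsity.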

High-dimensional location estimation via norm concentration for subgamma vectors

S Gupta, JCH Lee, E Price - International Conference on …, 2023 - proceedings.mlr.press
In location estimation, we are given $n$ samples from a known distribution $f$ shifted by
an unknown translation $\lambda$, and want to estimate $\lambda$ as precisely as …
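Concretely, the model is $x_i = \lambda + z_i$ with $z_1, \dots, z_n$ drawn i.i.d. from the known density $f$; the classical benchmark (a standard fact, not part of the abstract) is that the maximum likelihood estimator attains asymptotic variance $1/(n \mathcal{I}(f))$, where $\mathcal{I}(f) = \int (f'/f)^2 f$ is the Fisher information of $f$.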