Divide-and-conquer based methods for Bayesian inference provide a general approach for tractable posterior inference when the sample size is large. These methods divide the data …
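The combination step is cut off in the snippet above. As a minimal sketch of the generic divide-and-conquer recipe (split the data into shards, sample each subposterior, recombine the draws), assuming a simple consensus-style weighted average as the combination rule and a Gaussian toy model so the subposteriors are available in closed form; this is an illustration of the general approach, not the specific algorithm of the entry:

```python
# Generic divide-and-conquer Bayesian inference sketch: split the data,
# draw from each subposterior, then combine. The combination rule here is
# inverse-variance weighted averaging of draws (consensus-style), applied
# to a Gaussian toy model with a flat prior so everything is closed form.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(mu, 1), flat prior on mu.
n, K, mu_true = 10_000, 10, 2.0
y = rng.normal(mu_true, 1.0, size=n)
shards = np.array_split(y, K)

def subposterior_draws(shard, n_draws=2_000):
    """Posterior of mu given one shard under a flat prior: N(mean, 1/len)."""
    m, v = shard.mean(), 1.0 / len(shard)
    return rng.normal(m, np.sqrt(v), size=n_draws)

draws = np.column_stack([subposterior_draws(s) for s in shards])  # (n_draws, K)

# Combine: weight each shard's draws by its inverse subposterior variance.
w = 1.0 / draws.var(axis=0)
combined = draws @ w / w.sum()

print("combined posterior mean ~", combined.mean())
print("full-data posterior mean  ", y.mean())
```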
Concentration inequalities form an essential toolkit in the study of high-dimensional statistical methods. Most of the relevant statistics literature in this regard is, however, based …
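For orientation, a canonical example of the kind of inequality meant here is Hoeffding's bound for independent bounded variables (a standard textbook fact, not a result of the entry above):

$$\Pr\Big(\Big|\frac{1}{n}\sum_{i=1}^{n}\big(X_i-\mathbb{E}X_i\big)\Big|\ge t\Big)\;\le\;2\exp\Big(-\frac{2n^2t^2}{\sum_{i=1}^{n}(b_i-a_i)^2}\Big),\qquad X_i\in[a_i,b_i]\ \text{independent}.$$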
We address the problem of learning the parameters of a stable linear time-invariant (LTI) system with unknown latent space dimension, or order, from a single time series of noisy …
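One classical route to this kind of problem (not necessarily the entry's algorithm) is a Ho-Kalman style realization: estimate the Markov parameters by regression, form their Hankel matrix, read the order off its singular values, and recover a state-space realization from the truncated SVD. A rough sketch under these assumptions:

```python
# Ho-Kalman style sketch: regress outputs on lagged inputs to estimate the
# Markov parameters, build their Hankel matrix, estimate the order from its
# singular values, and recover (A, B, C) from the truncated SVD.
import numpy as np

rng = np.random.default_rng(1)

# Simulate a stable SISO system x_{t+1} = A x_t + B u_t, y_t = C x_t + noise.
A = np.array([[0.8, 0.2], [0.0, 0.5]])
B = np.array([[1.0], [0.5]])
C = np.array([[1.0, -1.0]])
T, sigma = 5_000, 0.1
u = rng.normal(size=T)
x = np.zeros((2, 1)); y = np.zeros(T)
for t in range(T):
    y[t] = (C @ x).item() + sigma * rng.normal()
    x = A @ x + B * u[t]

# 1. Estimate Markov parameters G_k = C A^k B by least squares on input lags.
L = 20
U = np.array([u[t - L:t][::-1] for t in range(L, T)])   # [u_{t-1}, ..., u_{t-L}]
G = np.linalg.lstsq(U, y[L:T], rcond=None)[0]

# 2. Hankel matrix of Markov parameters; its numerical rank is the order.
p = L // 2
H = np.array([[G[i + j] for j in range(p)] for i in range(p)])
Uh, s, Vh = np.linalg.svd(H)
r = int(np.sum(s > s[0] * 1e-2))                        # crude order estimate
print("estimated order:", r, "leading singular values:", np.round(s[:4], 3))

# 3. Balanced realization from the truncated SVD (Ho-Kalman).
Obs = Uh[:, :r] * np.sqrt(s[:r])                        # observability factor
Ctr = np.sqrt(s[:r])[:, None] * Vh[:r, :]               # controllability factor
C_hat = Obs[:1, :]                                      # recovered C (up to similarity)
B_hat = Ctr[:, :1]                                      # recovered B (up to similarity)
A_hat = np.linalg.pinv(Obs[:-1, :]) @ Obs[1:, :]        # shift-invariance of Obs
print("eigenvalues of A_hat:", np.round(np.linalg.eigvals(A_hat), 3))
```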
R Zhu, D Zeng, MR Kosorok - Journal of the American Statistical …, 2015 - Taylor & Francis
In this article, we introduce a new type of tree-based method, reinforcement learning trees (RLT), which exhibits significantly improved performance over traditional methods such as …
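A toy rendition of the idea the snippet describes, namely choosing each split with the help of an embedded model that ranks variables so that noise variables are effectively muted, is sketched below. It is a simplification for illustration, with arbitrary choices (an embedded random forest for importance, median split points), and is not the authors' RLT algorithm or its implementation.

```python
# Toy importance-guided tree: at each node, an embedded random forest ranks
# the candidate variables on the node's data, and the split is made on the
# variable judged most important.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def build_tree(X, y, depth=0, max_depth=4, min_leaf=20):
    if depth >= max_depth or len(y) < 2 * min_leaf:
        return {"leaf": True, "value": float(np.mean(y))}
    # Embedded forest estimates variable importance on this node's data.
    emb = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
    j = int(np.argmax(emb.feature_importances_))          # split variable
    c = float(np.median(X[:, j]))                         # split point
    left, right = X[:, j] <= c, X[:, j] > c
    if left.sum() < min_leaf or right.sum() < min_leaf:
        return {"leaf": True, "value": float(np.mean(y))}
    return {"leaf": False, "var": j, "cut": c,
            "left": build_tree(X[left], y[left], depth + 1, max_depth, min_leaf),
            "right": build_tree(X[right], y[right], depth + 1, max_depth, min_leaf)}

def predict_one(tree, x):
    while not tree["leaf"]:
        tree = tree["left"] if x[tree["var"]] <= tree["cut"] else tree["right"]
    return tree["value"]

# Example: strong signal in X[:, 0], many pure-noise covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 20))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=1_000)
tree = build_tree(X, y)
print("root splits on variable", tree["var"])             # expect 0
```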
We modify Talagrand's generic chaining method to obtain upper bounds for all p-th moments of the supremum of a stochastic process. These bounds lead to an estimate for the …
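As a hedged illustration of the shape such moment bounds usually take: for a process $(X_t)_{t\in T}$ with subgaussian increments with respect to a metric $d$, results in this line typically read

$$\Big(\mathbb{E}\sup_{t\in T}\big|X_t-X_{t_0}\big|^{p}\Big)^{1/p}\;\le\;C\Big(\gamma_2(T,d)+\sqrt{p}\,\Delta_d(T)\Big),$$

where $\gamma_2(T,d)$ is Talagrand's functional and $\Delta_d(T)$ is the diameter of $T$; the entry's precise assumptions and constants may differ from this generic statement.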
M Hebiri, J Lederer - IEEE Transactions on Information Theory, 2012 - ieeexplore.ieee.org
We study how correlations in the design matrix influence Lasso prediction. First, we argue that the higher the correlations, the smaller the optimal tuning parameter. This implies in …
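A quick numerical illustration of the stated claim (not the paper's analysis): simulate designs with increasing correlation and record the cross-validated Lasso tuning parameter. The simulation settings below are arbitrary choices made for the sketch, and the exact numbers will vary with them.

```python
# Illustration: with a more correlated (AR(1)-type) design, cross-validation
# tends to select a smaller Lasso tuning parameter.
import numpy as np
from sklearn.linear_model import LassoCV

def cv_lambda(rho, n=200, p=50, seed=0):
    rng = np.random.default_rng(seed)
    cov = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    beta = np.zeros(p); beta[:5] = 1.0
    y = X @ beta + rng.normal(size=n)
    return LassoCV(cv=5, random_state=0).fit(X, y).alpha_

for rho in (0.0, 0.5, 0.9):
    print(f"correlation {rho}: CV-selected alpha = {cv_lambda(rho):.4f}")
```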
J Lei - Journal of the American Statistical Association, 2020 - Taylor & Francis
Cross-validation is one of the most popular model and tuning parameter selection methods in statistics and machine learning. Despite its wide applicability, traditional cross-validation …
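For reference, a bare-bones version of the traditional K-fold cross-validation that the entry takes as its starting point (the paper's own proposal is cut off in the snippet and is not reproduced here):

```python
# Traditional K-fold cross-validation for tuning parameter selection,
# shown here for ridge regression on simulated data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=300)

alphas = np.logspace(-3, 3, 13)
cv_err = np.zeros(len(alphas))
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    for i, a in enumerate(alphas):
        model = Ridge(alpha=a).fit(X[train], y[train])
        cv_err[i] += np.mean((y[test] - model.predict(X[test])) ** 2)
print("selected alpha:", alphas[np.argmin(cv_err)])
```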
F Bunea, J Lederer, Y She - IEEE Transactions on Information …, 2013 - ieeexplore.ieee.org
We introduce and study the group square-root lasso (GSRL) method for estimation in high-dimensional sparse regression models with group structure. The new estimator minimizes …
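The minimized objective is cut off in the snippet. For orientation, a group square-root lasso criterion is typically of the following form (group weights and normalization may differ from the paper's exact definition):

$$\hat\beta\in\arg\min_{\beta\in\mathbb{R}^p}\;\frac{\|Y-X\beta\|_2}{\sqrt{n}}\;+\;\lambda\sum_{j=1}^{J}\sqrt{|G_j|}\,\|\beta_{G_j}\|_2 .$$

The square-root loss is what allows a good choice of $\lambda$ that does not depend on the unknown noise level, which is the usual motivation for this family of estimators.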
S Gupta, JCH Lee, E Price - International Conference on …, 2023 - proceedings.mlr.press
In location estimation, we are given $n$ samples from a known distribution $f$ shifted by an unknown translation $\lambda$, and want to estimate $\lambda$ as precisely as …
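As a worked statement of the setup (the classical benchmark, not necessarily the entry's contribution): the observations are $X_i=\lambda+Z_i$ with $Z_i\overset{iid}{\sim}f$, the maximum-likelihood estimator is

$$\hat\lambda_{\mathrm{MLE}}\in\arg\max_{\lambda'}\sum_{i=1}^{n}\log f(X_i-\lambda'),$$

and under standard regularity conditions its asymptotic variance is $\big(n\,\mathcal I(f)\big)^{-1}$, where $\mathcal I(f)=\int (f')^2/f$ is the Fisher information of the location family.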