An Online Bootstrap for Time Series

N Palm, T Nagler - International Conference on Artificial …, 2024 - proceedings.mlr.press
Resampling methods such as the bootstrap have proven invaluable in the field of machine
learning. However, the applicability of traditional bootstrap methods is limited when dealing …
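
A minimal sketch of the general idea behind a bootstrap for streaming, dependent data: bootstrap replicates of a running statistic are updated with serially correlated multiplier weights so that the replicates reflect the autocorrelation of the stream. This is an illustration only, not the authors' algorithm; the AR(1) multiplier weights and the running-mean statistic are assumptions made for the example.

```python
# Illustrative online multiplier bootstrap for a running mean of a dependent
# stream (not the paper's method). Each replicate keeps a weighted sum with
# AR(1)-correlated random weights, so replicates inherit serial dependence.
import numpy as np

rng = np.random.default_rng(0)
B = 200            # number of bootstrap replicates
rho = 0.9          # assumed AR(1) correlation of the multiplier weights
n = 0
mean = 0.0                     # running sample mean
boot_sums = np.zeros(B)        # weighted sums, one per replicate
weight_sums = np.zeros(B)      # weight totals, one per replicate
eps = np.zeros(B)              # current AR(1) state of the weights

def update(x):
    """Consume one observation x from the stream."""
    global n, mean
    n += 1
    mean += (x - mean) / n
    # serially correlated multiplier weights centered at 1 (illustrative choice)
    eps[:] = rho * eps + np.sqrt(1 - rho**2) * rng.standard_normal(B)
    w = 1.0 + eps
    boot_sums[:] += w * x
    weight_sums[:] += w

# simulated AR(1) data stream
x_t = 0.0
for _ in range(5000):
    x_t = 0.7 * x_t + rng.standard_normal()
    update(x_t)

boot_means = boot_sums / weight_sums
lo, hi = np.quantile(boot_means, [0.025, 0.975])
print(f"running mean = {mean:.3f}, 95% bootstrap interval = ({lo:.3f}, {hi:.3f})")
```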

Bagging Improves Generalization Exponentially

H Jie, D Ying, H Lam, W Yin - arXiv preprint arXiv:2405.14741, 2024 - arxiv.org
Bagging is a popular ensemble technique to improve the accuracy of machine learning
models. It hinges on the well-established rationale that, by repeatedly retraining on …
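
For reference, a minimal sketch of classical bagging (bootstrap aggregating), the technique the abstract refers to: each base model is trained on a bootstrap resample of the data and predictions are combined by majority vote. Illustrative only; the choice of decision-tree base learners and the synthetic data are assumptions, and none of the paper's analysis is reproduced.

```python
# Minimal bagging sketch: train base models on bootstrap resamples
# (sampling with replacement) and aggregate predictions by majority vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

def bagged_fit(X, y, n_models=25):
    n = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)   # bootstrap resample, with replacement
        models.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))
    return models

def bagged_predict(models, X):
    votes = np.stack([m.predict(X) for m in models])   # (n_models, n_samples)
    return (votes.mean(axis=0) >= 0.5).astype(int)      # majority vote

models = bagged_fit(X, y)
print("training accuracy:", (bagged_predict(models, X) == y).mean())
```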

Gaussian Approximation and Multiplier Bootstrap for Polyak-Ruppert Averaged Linear Stochastic Approximation with Applications to TD Learning

S Samsonov, E Moulines, QM Shao, ZS Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
In this paper, we obtain the Berry-Esseen bound for multivariate normal approximation for
the Polyak-Ruppert averaged iterates of the linear stochastic approximation (LSA) algorithm …
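
A minimal sketch of the two ingredients named in the title, under assumed notation and not the paper's exact scheme: linear stochastic approximation with Polyak-Ruppert averaging, and a multiplier bootstrap that reruns the same recursion with random weights of mean one to approximate the distribution of the averaged iterate.

```python
# Illustrative LSA with Polyak-Ruppert averaging and a multiplier bootstrap.
# Target: solve A_bar @ theta = b_bar from noisy observations (A_k, b_k).
import numpy as np

rng = np.random.default_rng(0)
d, n, B = 3, 20000, 100
A_bar = np.eye(d) + 0.1 * np.diag(np.arange(d))
b_bar = np.ones(d)
theta_star = np.linalg.solve(A_bar, b_bar)

alpha = 0.3                                   # step size alpha_k = alpha / k**0.7
theta = np.zeros(d)
theta_boot = np.zeros((B, d))
avg, avg_boot = np.zeros(d), np.zeros((B, d))

for k in range(1, n + 1):
    A_k = A_bar + 0.1 * rng.standard_normal((d, d))   # noisy observation of A_bar
    b_k = b_bar + 0.1 * rng.standard_normal(d)        # noisy observation of b_bar
    step = alpha / k**0.7
    theta = theta + step * (b_k - A_k @ theta)                  # LSA update
    w = 1.0 + rng.standard_normal((B, 1))                       # multiplier weights, mean 1
    theta_boot = theta_boot + step * w * (b_k - theta_boot @ A_k.T)
    avg += (theta - avg) / k                                    # Polyak-Ruppert average
    avg_boot += (theta_boot - avg_boot) / k

err = np.linalg.norm(avg_boot - avg, axis=1)   # bootstrap distances from the average
print("||avg - theta*|| =", np.linalg.norm(avg - theta_star))
print("95% bootstrap radius:", np.quantile(err, 0.95))
```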

[PDF] The Stochastic Gradient Descent from a Nonlinear Time Series Perspective

J Li, Z Lou, S Richter, WB Wu - stat.math.uni-heidelberg.de
This paper revisits the statistical behaviors of Stochastic Gradient Descent (SGD) through a
novel perspective of time series analysis. Traditional approaches, mostly treating SGD as …
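
A small illustration (not the paper's methodology) of the time-series viewpoint: with a constant step size on a simple quadratic objective, SGD iterates form an autocorrelated, roughly stationary sequence whose dependence can be summarized by sample autocorrelations. The objective and step size below are assumptions chosen for the example.

```python
# SGD iterates as a time series: constant-step SGD on 0.5*(theta - x)^2,
# then sample autocorrelations of the iterate sequence after burn-in.
import numpy as np

rng = np.random.default_rng(0)
theta_star, step, n = 2.0, 0.05, 20000
theta = 0.0
trace = np.empty(n)
for t in range(n):
    x = theta_star + rng.standard_normal()   # noisy sample with mean theta_star
    grad = theta - x                          # gradient of 0.5*(theta - x)**2
    theta -= step * grad                      # SGD update
    trace[t] = theta

trace = trace[n // 2:]                        # discard burn-in
centered = trace - trace.mean()
autocorr = [np.dot(centered[:-k], centered[k:]) / np.dot(centered, centered)
            for k in range(1, 6)]
print("sample autocorrelation at lags 1-5:", np.round(autocorr, 3))
```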

Subsampled Ensemble Can Improve Generalization Tail Exponentially

openreview.net
Ensemble learning is a popular technique to improve the accuracy of machine learning
models. It hinges on the rationale that aggregating multiple weak models can lead to better …
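
A minimal sketch of a subsampled ensemble (sometimes called subagging): base models are trained on subsamples drawn without replacement and combined by majority vote. This only illustrates the resampling-and-aggregation idea; the paper's specific aggregation scheme and its tail analysis are not reproduced, and the logistic-regression base learner and subsample fraction are assumptions.

```python
# Subsampled ensemble sketch: subsamples without replacement, majority vote.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 4))
y = (X @ np.array([1.0, -1.0, 0.5, 0.0]) > 0).astype(int)

def subsampled_ensemble(X, y, n_models=25, frac=0.5):
    n = len(y)
    k = int(frac * n)
    models = []
    for _ in range(n_models):
        idx = rng.choice(n, size=k, replace=False)   # subsample without replacement
        models.append(LogisticRegression().fit(X[idx], y[idx]))
    return models

models = subsampled_ensemble(X, y)
votes = np.stack([m.predict(X) for m in models])
pred = (votes.mean(axis=0) >= 0.5).astype(int)       # majority vote
print("training accuracy:", (pred == y).mean())
```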