User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
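
For orientation, a minimal sketch of the kind of guarantee this survey builds up to: the classical McAllester–Maurer PAC-Bayes bound, stated here for losses in [0, 1], i.i.d. data, and a prior π fixed before seeing the n observations (notation chosen for illustration, not necessarily the survey's). With probability at least 1 − δ, simultaneously for all posteriors ρ,

```latex
\mathbb{E}_{\theta \sim \rho}\!\left[ R(\theta) \right]
  \;\le\;
\mathbb{E}_{\theta \sim \rho}\!\left[ \widehat{R}_n(\theta) \right]
  \;+\;
\sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\frac{2\sqrt{n}}{\delta}}{2n}} ,
```

where R is the expected risk, \widehat{R}_n the empirical risk, and KL the Kullback–Leibler divergence; tighter and more general variants (Catoni-type bounds, non-i.i.d. settings) are among the refinements treated in this literature.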

How is machine learning useful for macroeconomic forecasting?

P Goulet Coulombe, M Leroux… - Journal of Applied …, 2022 - Wiley Online Library
We move beyond "Is Machine Learning Useful for Macroeconomic Forecasting?" by adding
the how. The current forecasting literature has focused on matching specific …

Learning theory and algorithms for forecasting non-stationary time series

V Kuznetsov, M Mohri - Advances in neural information …, 2015 - proceedings.neurips.cc
We present data-dependent learning bounds for the general scenario of non-stationary non-
mixing stochastic processes. Our learning guarantees are expressed in terms of a data …

Generalization bounds for non-stationary mixing processes

V Kuznetsov, M Mohri - Machine Learning, 2017 - Springer
This paper presents the first generalization bounds for time series prediction with a non-
stationary mixing stochastic process. We prove Rademacher complexity learning bounds for …

Simpler PAC-Bayesian bounds for hostile data

P Alquier, B Guedj - Machine Learning, 2018 - Springer
PAC-Bayesian learning bounds are of the utmost interest to the learning community. Their
role is to connect the generalization ability of an aggregation distribution ρ to its empirical …

Optimal learning with Bernstein online aggregation

O Wintenberger - Machine Learning, 2017 - Springer
We introduce a new recursive aggregation procedure called Bernstein Online Aggregation
(BOA). Its exponential weights include a second-order refinement. The procedure is optimal …
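
To make the "exponential weights" idea concrete, here is a minimal Hedge-style aggregation loop in Python. It is only the vanilla first-order scheme that BOA refines; the second-order correction in the weight update is precisely what Wintenberger's procedure adds, and the squared-error loss and learning rate eta below are illustrative choices, not the paper's.

```python
import numpy as np

def exp_weights_aggregation(expert_preds, targets, eta=0.5):
    """Vanilla exponential-weights (Hedge) aggregation of expert forecasts.

    expert_preds : (T, K) array -- forecasts of K experts over T rounds
    targets      : (T,) array   -- observed outcomes
    Returns the aggregated forecasts and the final weight vector.

    Note: BOA modifies the update below with a second-order term;
    this sketch only shows the basic first-order scheme.
    """
    T, K = expert_preds.shape
    w = np.full(K, 1.0 / K)                # uniform prior over experts
    aggregated = np.empty(T)
    for t in range(T):
        aggregated[t] = w @ expert_preds[t]            # weighted vote
        losses = (expert_preds[t] - targets[t]) ** 2   # per-expert loss
        w *= np.exp(-eta * losses)                     # exponential reweighting
        w /= w.sum()                                   # renormalise
    return aggregated, w

# Tiny synthetic usage example: the informative expert should dominate.
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0.0, 6.0, 200))
experts = np.stack([y + 0.1 * rng.normal(size=200),    # good expert
                    rng.normal(size=200)], axis=1)     # pure-noise expert
preds, weights = exp_weights_aggregation(experts, y)
print(weights)
```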

Time series prediction and online learning

V Kuznetsov, M Mohri - Conference on Learning Theory, 2016 - proceedings.mlr.press
We present a series of theoretical and algorithmic results combining the benefits of the
statistical learning approach to time series prediction with that of on-line learning. We prove …

On empirical risk minimization with dependent and heavy-tailed data

A Roy, K Balasubramanian… - Advances in Neural …, 2021 - proceedings.neurips.cc
In this work, we establish risk bounds for Empirical Risk Minimization (ERM) with both
dependent and heavy-tailed data-generating processes. We do so by extending the seminal …
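
For context, the estimator analysed is the standard empirical risk minimiser (generic notation, not necessarily the authors'):

```latex
\widehat{\theta}_n \;\in\; \arg\min_{\theta \in \Theta} \; \frac{1}{n} \sum_{i=1}^{n} \ell(\theta, Z_i),
```

and the cited results bound the risk of this estimator when the observations Z_1, …, Z_n are dependent and the losses may be heavy-tailed.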

Discrepancy-based theory and algorithms for forecasting non-stationary time series

V Kuznetsov, M Mohri - Annals of Mathematics and Artificial Intelligence, 2020 - Springer
We present data-dependent learning bounds for the general scenario of non-stationary non-
mixing stochastic processes. Our learning guarantees are expressed in terms of a data …

PAC-Bayes Generalisation Bounds for Dynamical Systems Including Stable RNNs

D Eringis, J Leth, ZH Tan, R Wisniewski… - Proceedings of the …, 2024 - ojs.aaai.org
In this paper, we derive a PAC-Bayes bound on the generalisation gap in a supervised
time-series setting for a special class of discrete-time non-linear dynamical systems. This class …