User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …

Learning sparse nonparametric DAGs

X Zheng, C Dan, B Aragam… - International …, 2020 - proceedings.mlr.press
We develop a framework for learning sparse nonparametric directed acyclic graphs (DAGs)
from data. Our approach is based on a recent algebraic characterization of DAGs that led to …

A primer on PAC-Bayesian learning

B Guedj - arXiv preprint arXiv:1901.05353, 2019 - arxiv.org
Generalised Bayesian learning algorithms are increasingly popular in machine learning,
due to their PAC generalisation properties and flexibility. The present paper aims at …

[BOOK][B] Sufficient dimension reduction: Methods and applications with R

B Li - 2018 - taylorfrancis.com
Sufficient dimension reduction is a rapidly developing research field that has wide
applications in regression diagnostics, data visualization, machine learning, genomics …

On the properties of variational approximations of Gibbs posteriors

P Alquier, J Ridgway, N Chopin - Journal of Machine Learning Research, 2016 - jmlr.org
The PAC-Bayesian approach is a powerful set of techniques to derive nonasymptotic risk
bounds for random estimators. The corresponding optimal distribution of estimators, usually …

The generalized lasso with non-linear observations

Y Plan, R Vershynin - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
We study the problem of signal estimation from non-linear observations when the signal
belongs to a low-dimensional set buried in a high-dimensional space. A rough heuristic …

Online PAC-Bayes learning

M Haddouche, B Guedj - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Most PAC-Bayesian bounds hold in the batch learning setting where data is collected at
once, prior to inference or prediction. This somewhat departs from many contemporary …

PAC-Bayes generalisation bounds for heavy-tailed losses through supermartingales

M Haddouche, B Guedj - arXiv preprint arXiv:2210.00928, 2022 - arxiv.org
While PAC-Bayes is now an established learning framework for light-tailed losses (e.g., subgaussian or subexponential), its extension to the case of heavy-tailed losses …

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj, M Raginsky - arXiv preprint arXiv …, 2023 - arxiv.org
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …

High-dimensional estimation with geometric constraints

Y Plan, R Vershynin, E Yudovina - Information and Inference: A …, 2017 - academic.oup.com
Consider measuring a vector through the inner product with several measurement vectors. It
is common in both signal processing and statistics to assume the linear response model …