Tightening mutual information-based bounds on generalization error

Y Bu, S Zou, VV Veeravalli - IEEE Journal on Selected Areas in …, 2020 - ieeexplore.ieee.org
An information-theoretic upper bound on the generalization error of supervised learning
algorithms is derived. The bound is constructed in terms of the mutual information between …
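
A hedged restatement of the flavor of this result (notation chosen here): for a loss that is \sigma-subgaussian under the data distribution, with W the learned hypothesis and S = (Z_1, \dots, Z_n) the training sample,

\[ \bigl|\mathbb{E}[\mathrm{gen}(W,S)]\bigr| \;\le\; \frac{1}{n}\sum_{i=1}^{n}\sqrt{2\sigma^{2}\, I(W; Z_i)}, \]

where the individual-sample terms I(W; Z_i) can be much smaller than the full-sample mutual information I(W; S) appearing in the Xu–Raginsky bound \sqrt{2\sigma^{2} I(W;S)/n}.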

Information-theoretic generalization bounds for SGLD via data-dependent estimates

J Negrea, M Haghifam, GK Dziugaite… - Advances in …, 2019 - proceedings.neurips.cc
In this work, we improve upon the stepwise analysis of noisy iterative learning algorithms
initiated by Pensia, Jog, and Loh (2018) and recently extended by Bu, Zou, and Veeravalli …
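
For context, the stepwise analysis referenced here treats noisy iterative updates W_{t+1} = W_t - \eta_t g_t + \xi_t with bounded update directions \|g_t\| \le L and Gaussian noise \xi_t \sim \mathcal{N}(0, \sigma_t^{2} I_d); a commonly quoted form of the Pensia–Jog–Loh bound (stated here from memory, constants worth double-checking) is

\[ I(W_T; S) \;\le\; \sum_{t=1}^{T} \frac{d}{2}\,\log\!\Bigl(1 + \frac{\eta_t^{2} L^{2}}{d\,\sigma_t^{2}}\Bigr), \]

which is then fed into a mutual-information generalization bound; the data-dependent estimates of this paper tighten that pipeline for SGLD.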

Sharpened generalization bounds based on conditional mutual information and an application to noisy, iterative algorithms

M Haghifam, J Negrea, A Khisti… - Advances in …, 2020 - proceedings.neurips.cc
The information-theoretic framework of Russo and Zou (2016) and Xu and Raginsky (2017)
provides bounds on the generalization error of a learning algorithm in terms of the mutual …
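
In the conditional mutual information (CMI) framework that this paper sharpens (Steinke and Zakynthinou, 2020), one draws a supersample \tilde{Z} \in \mathcal{Z}^{n \times 2}, selects the training set by independent fair coin flips U \in \{0,1\}^n, and obtains, for losses bounded in [0,1],

\[ \bigl|\mathbb{E}[\mathrm{gen}]\bigr| \;\le\; \sqrt{\frac{2\, I(W; U \mid \tilde{Z})}{n}}, \]

where the conditioning on \tilde{Z} keeps the information term finite even when I(W; S) is infinite.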

An exact characterization of the generalization error for the Gibbs algorithm

G Aminian, Y Bu, L Toni… - Advances in Neural …, 2021 - proceedings.neurips.cc
Various approaches have been developed to upper bound the generalization error of a
supervised learning algorithm. However, existing bounds are often loose and lack …
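
The characterization can be sketched as follows (notation chosen here, to be checked against the paper's statement): for the Gibbs posterior P_{W|S}(w) \propto \pi(w)\, e^{-\alpha L_S(w)} with inverse temperature \alpha, the expected generalization error equals a symmetrized KL information,

\[ \overline{\mathrm{gen}} \;=\; \frac{I_{\mathrm{SKL}}(S; W)}{\alpha}, \qquad I_{\mathrm{SKL}}(S;W) = I(S;W) + L(S;W), \]

where L(S;W) denotes the lautum information; the usual one-sided mutual-information bounds thus become an exact identity for this particular algorithm.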

Chaining mutual information and tightening generalization bounds

A Asadi, E Abbe, S Verdú - Advances in Neural Information …, 2018 - proceedings.neurips.cc
Bounding the generalization error of learning algorithms has a long history, yet it still falls
short of explaining various generalization successes, including those of deep learning. Two …
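
Schematically (constants and regularity conditions omitted, and only as a sketch of the chaining idea), the single mutual-information term is replaced by a multiscale sum over successively finer quantizations [W]_k of the hypothesis:

\[ \mathbb{E}[\mathrm{gen}] \;\lesssim\; \sum_{k} 2^{-k} \sqrt{2\, I([W]_k; S)}, \]

which can remain finite and small even when the unchained term I(W; S) blows up.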

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj, M Raginsky - arXiv preprint arXiv …, 2023 - arxiv.org
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
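
One canonical PAC-Bayesian example covered by such surveys (a Maurer/McAllester-type form for losses in [0,1], stated here from memory): for any data-free prior P, with probability at least 1 - \delta over the n training samples, simultaneously for all posteriors Q,

\[ \mathbb{E}_{W \sim Q}\bigl[L_{\mathcal{D}}(W) - L_S(W)\bigr] \;\le\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln(2\sqrt{n}/\delta)}{2n}}, \]

and the information-theoretic bounds in the entries above follow from closely related change-of-measure arguments.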

Generalization Error Bounds via Rényi-, f-Divergences and Maximal Leakage

AR Esposito, M Gastpar, I Issa - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
In this work, the probability of an event under some joint distribution is bounded by
measuring it with the product of the marginals instead (which is typically easier to analyze) …
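
The decoupling step behind such results can be sketched with Hölder's inequality: for any event E, joint distribution P_{XY}, and order \alpha > 1,

\[ P_{XY}(E) \;\le\; \bigl(P_X \otimes P_Y(E)\bigr)^{\frac{\alpha-1}{\alpha}} \exp\!\Bigl(\tfrac{\alpha-1}{\alpha}\, D_{\alpha}\bigl(P_{XY} \,\|\, P_X \otimes P_Y\bigr)\Bigr), \]

so an event that is rare under the product of the marginals stays controllably rare under the joint; letting \alpha \to \infty connects the exponent to maximal leakage. (This is a sketch of the decoupling idea, not a verbatim statement from the paper.)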

Information-theoretic bounds on the generalization error and privacy leakage in federated learning

S Yagli, A Dytso, HV Poor - 2020 IEEE 21st International …, 2020 - ieeexplore.ieee.org
Machine learning algorithms operating on mobile networks can be grouped into three
categories. First is the classical situation in which the end-user devices send their …
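
As an illustration of the kind of bound that arises in this setting (a schematic specialization of the individual-sample bounds above to per-client datasets, not necessarily the paper's exact statement): with K clients, each holding n samples S_k of a \sigma-subgaussian loss, and W the aggregated model,

\[ \bigl|\mathbb{E}[\mathrm{gen}]\bigr| \;\le\; \frac{1}{K} \sum_{k=1}^{K} \sqrt{\frac{2\sigma^{2}\, I(W; S_k)}{n}}, \]

where the per-client terms I(W; S_k) double as a measure of how much the released model leaks about each client's local data.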

An information-theoretic view of generalization via Wasserstein distance

H Wang, M Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org
We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the
generalization error of learning algorithms. First, we specialize the Wasserstein distance into …
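
One way such bounds arise (a sketch assuming the loss \ell is L-Lipschitz with respect to a metric on the product space of hypotheses and samples): writing the generalization error as a sum of expectation gaps and applying Kantorovich–Rubinstein duality gives

\[ \bigl|\mathbb{E}[\mathrm{gen}]\bigr| \;\le\; \frac{L}{n} \sum_{i=1}^{n} \mathbb{W}_1\!\bigl(P_{W, Z_i},\; P_W \otimes P_{Z_i}\bigr), \]

and transportation-cost inequalities then relate the Wasserstein terms back to KL divergence and mutual information.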

Fine-grained theoretical analysis of federated zeroth-order optimization

J Chen, H Chen, B Gu, H Deng - Advances in Neural …, 2024 - proceedings.neurips.cc
The federated zeroth-order optimization (FedZO) algorithm enjoys the advantages of both zeroth-
order optimization and federated learning, and has shown exceptional performance on …
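
For reference, FedZO-style methods replace client gradients with randomized finite differences; a standard two-point estimator (generic form, not necessarily the exact variant analyzed here) is

\[ \hat{g}_k \;=\; \frac{d}{2\mu}\,\bigl(f_k(x + \mu u) - f_k(x - \mu u)\bigr)\, u, \qquad u \sim \mathrm{Unif}(\mathbb{S}^{d-1}), \]

which is an unbiased estimate of the gradient of a smoothed surrogate of the client objective f_k; local steps built from such estimates are then averaged at the server, FedAvg-style.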