Smoothed noise contrastive mutual information neural estimation

X Wang, A Al-Bashabsheh, C Zhao, C Chan - Journal of the Franklin …, 2023 - Elsevier
Information Noise Contrastive Estimation (InfoNCE) is a popular neural estimator of
mutual information (MI). While InfoNCE has demonstrated impressive results in …

Tight mutual information estimation with contrastive Fenchel-Legendre optimization

Q Guo, J Chen, D Wang, Y Yang… - Advances in …, 2022 - proceedings.neurips.cc
Successful applications of InfoNCE (Information Noise-Contrastive Estimation) and
its variants have popularized the use of contrastive variational mutual information (MI) …

Variational f-Divergence and Derangements for Discriminative Mutual Information Estimation

NA Letizia, N Novello, AM Tonello - arXiv preprint arXiv:2305.20025, 2023 - arxiv.org
The accurate estimation of the mutual information is a crucial task in various applications,
including machine learning, communications, and biology, since it enables the …

Understanding the limitations of variational mutual information estimators

J Song, S Ermon - arXiv preprint arXiv:1910.06222, 2019 - arxiv.org
Variational approaches based on neural networks are showing promise for estimating
mutual information (MI) between high dimensional variables. However, they can be difficult …

Adaptive label smoothing for classifier-based mutual information neural estimation

X Wang, A Al-Bashabsheh, C Zhao… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
Estimating the mutual information (MI) by neural networks has achieved significant practical
success, especially in representation learning. Recent results further reduced the variance …

On variational bounds of mutual information

B Poole, S Ozair, A Van Den Oord… - International …, 2019 - proceedings.mlr.press
Estimating and optimizing Mutual Information (MI) is core to many problems in
machine learning, but bounding MI in high dimensions is challenging. To establish tractable …
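As context for the variational bounds surveyed in these works, the widely used InfoNCE lower bound on MI can be computed from a matrix of critic scores over a batch of paired samples. The sketch below is illustrative only (the function name and toy critic are assumptions, not from any of the listed papers); it uses the standard form of the bound, which is capped at log K for batch size K.

```python
import numpy as np

def infonce_lower_bound(scores):
    """InfoNCE lower bound on MI from a K x K critic score matrix.

    scores[i, j] holds the critic value f(x_i, y_j); diagonal entries
    correspond to the jointly drawn (positive) pairs. The bound is
    log(K) plus the mean diagonal log-softmax, and never exceeds log(K).
    """
    K = scores.shape[0]
    # Row-wise log-softmax, stabilized by subtracting the row maximum.
    row_max = scores.max(axis=1, keepdims=True)
    log_norm = np.log(np.exp(scores - row_max).sum(axis=1, keepdims=True))
    log_softmax = scores - row_max - log_norm
    return np.log(K) + np.mean(np.diag(log_softmax))

# Toy example: random scores with boosted diagonal (positives score higher).
rng = np.random.default_rng(0)
K = 128
scores = rng.normal(size=(K, K))
scores[np.diag_indices(K)] += 3.0
print(infonce_lower_bound(scores))  # some value in (0, log 128]
```

The log K cap is precisely the saturation issue that motivates several of the variance-reduction and tightened-bound estimators listed above.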

DEMI: Discriminative estimator of mutual information

R Liao, D Moyer, P Golland, WM Wells - arXiv preprint arXiv:2010.01766, 2020 - arxiv.org
Estimating mutual information between continuous random variables is often intractable and
extremely challenging for high-dimensional data. Recent progress has leveraged neural …

C-MI-GAN: Estimation of conditional mutual information using MinMax formulation

A Mondal, A Bhattacharjee… - … on Uncertainty in …, 2020 - proceedings.mlr.press
Estimation of information theoretic quantities such as mutual information and its conditional
variant has drawn interest in recent times owing to their multifaceted applications. Newly …

Data-efficient mutual information neural estimator

X Lin, I Sur, SA Nastase, A Divakaran… - arXiv preprint arXiv …, 2019 - arxiv.org
Measuring Mutual Information (MI) between high-dimensional, continuous, random
variables from observed samples has wide theoretical and practical applications. Recent …

Neural estimators for conditional mutual information using nearest neighbors sampling

S Molavipour, G Bassi… - IEEE transactions on signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set
of samples is a longstanding problem. A recent line of work in this area has leveraged the …