Mutual Information Estimation using LSH Sampling.

R Spring, A Shrivastava - IJCAI, 2020 - ijcai.org
Learning representations in an unsupervised or self-supervised manner is a growing area of
research. Current approaches in representation learning seek to maximize the mutual …

Estimating mutual information by local Gaussian approximation

S Gao, GV Steeg, A Galstyan - arXiv preprint arXiv:1508.00536, 2015 - arxiv.org
Estimating mutual information (MI) from samples is a fundamental problem in statistics,
machine learning, and data analysis. Recently it was shown that a popular class of non …

Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models

T Sakai, M Sugiyama - IEICE Transactions on Information and …, 2014 - search.ieice.org
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence
between random variables. The sample-based SMI approximator called least-squares …

Bayesian and quasi-Bayesian estimators for mutual information from discrete data

E Archer, IM Park, JW Pillow - Entropy, 2013 - mdpi.com
Mutual information (MI) quantifies the statistical dependency between a pair of random
variables, and plays a central role in the analysis of engineering and biological systems …

Smoothed noise contrastive mutual information neural estimation

X Wang, A Al-Bashabsheh, C Zhao, C Chan - Journal of the Franklin …, 2023 - Elsevier
Information Noise Contrastive Estimation (InfoNCE) is a popular neural estimator of
mutual information (MI). While InfoNCE has demonstrated impressive results in …
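
InfoNCE lower-bounds I(X;Y) with a classification-style objective over K paired samples: I ≥ E[f(xᵢ, yᵢ) − log (1/K) Σⱼ exp f(xᵢ, yⱼ)]. A minimal NumPy sketch of that bound, using a hypothetical fixed bilinear critic f(x, y) = xy in place of the learned neural critic the paper assumes:

```python
import numpy as np
from scipy.special import logsumexp

def infonce(x, y, critic):
    """InfoNCE lower bound on I(X;Y) in nats, from K paired samples."""
    # scores[i, j] = critic(x_i, y_j); the diagonal holds the positive pairs
    scores = critic(x[:, None], y[None, :])
    k = len(x)
    # mean_i [ f(x_i, y_i) - log( (1/K) * sum_j exp f(x_i, y_j) ) ]
    return np.mean(np.diag(scores) - logsumexp(scores, axis=1) + np.log(k))

rng = np.random.default_rng(1)
x = rng.normal(size=512)
y = x + 0.3 * rng.normal(size=512)           # strongly dependent pair
bound = infonce(x, y, lambda a, b: a * b)    # positive for dependent data
```

Note that the bound can never exceed log K regardless of the true MI, which is the saturation issue motivating smoothed variants like the one above.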

Neural estimators for conditional mutual information using nearest neighbors sampling

S Molavipour, G Bassi… - IEEE Transactions on Signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set
of samples is a longstanding problem. A recent line of work in this area has leveraged the …

Estimating total correlation with mutual information estimators

K Bai, P Cheng, W Hao, R Henao… - … Conference on Artificial …, 2023 - proceedings.mlr.press
Total correlation (TC) is a fundamental concept in information theory that measures
statistical dependency among multiple random variables. Recently, TC has shown …
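
Total correlation decomposes as TC(X₁, …, X_d) = Σᵢ H(Xᵢ) − H(X₁, …, X_d), so for discrete data a plug-in estimate is a few lines of NumPy. A sketch, assuming alphabets small enough for empirical entropies to be reliable (the paper's MI-estimator approach targets the continuous/neural setting instead):

```python
import numpy as np

def entropy(rows):
    """Plug-in Shannon entropy (nats) of discrete samples, one sample per row."""
    _, counts = np.unique(rows, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def total_correlation(data):
    """TC = sum_i H(X_i) - H(X_1, ..., X_d) for an (n, d) array of discrete samples."""
    data = np.asarray(data)
    return sum(entropy(data[:, [i]]) for i in range(data.shape[1])) - entropy(data)

# Two perfectly coupled fair bits share ln 2 nats of total correlation
tc = total_correlation(np.array([[0, 0], [1, 1]] * 50))
```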

Jackknife approach to the estimation of mutual information

X Zeng, Y Xia, H Tong - Proceedings of the National …, 2018 - National Academy of Sciences
Quantifying the dependence between two random variables is a fundamental issue in data
analysis, and thus many measures have been proposed. Recent studies have focused on …

Efficient estimation of mutual information for strongly dependent variables

S Gao, G Ver Steeg, A Galstyan - Artificial intelligence and …, 2015 - proceedings.mlr.press
We demonstrate that a popular class of non-parametric mutual information (MI) estimators
based on k-nearest-neighbor graphs requires a number of samples that scales exponentially …
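
The kNN estimators this entry critiques follow the Kraskov–Stögbauer–Grassberger (KSG) construction: the distance to the k-th nearest neighbour in the joint space sets a local scale, and marginal neighbour counts within that scale enter a digamma formula. A brute-force O(n²) sketch for scalar variables:

```python
import numpy as np
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG (algorithm 1) MI estimate in nats for scalar samples x, y."""
    x = np.asarray(x, float).reshape(-1, 1)
    y = np.asarray(y, float).reshape(-1, 1)
    n = len(x)
    dx = np.abs(x - x.T)                # pairwise distances in each marginal
    dy = np.abs(y - y.T)
    dz = np.maximum(dx, dy)             # Chebyshev (max-norm) distance in the joint space
    np.fill_diagonal(dz, np.inf)        # a point is not its own neighbour
    eps = np.sort(dz, axis=1)[:, k - 1]          # distance to the k-th joint neighbour
    nx = (dx < eps[:, None]).sum(axis=1) - 1     # marginal counts, excluding the point itself
    ny = (dy < eps[:, None]).sum(axis=1) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

Running this on y = x + noise with the noise scale shrinking toward zero illustrates the entry's point: as the dependence strengthens, the sample size needed for an accurate estimate blows up.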

Improving mutual information estimation with annealed and energy-based bounds

R Brekelmans, S Huang, M Ghassemi… - arXiv preprint arXiv …, 2023 - arxiv.org
Mutual information (MI) is a fundamental quantity in information theory and machine
learning. However, direct estimation of MI is intractable, even if the true joint probability …