Mutual Information Estimation via Normalizing Flows

I Butakov, A Tolmachev, S Malanchuk… - arXiv preprint arXiv …, 2024 - arxiv.org
We propose a novel approach to the problem of mutual information (MI) estimation by
introducing a normalizing-flows-based estimator. The estimator maps the original data to the …
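Mutual information is invariant under invertible transformations of each variable, which is why a flow that Gaussianizes the data reduces MI estimation to a closed-form expression. A minimal sketch of that final step, assuming the (transformed) pair is already jointly Gaussian and omitting the flow itself:

```python
import numpy as np

def gaussian_mi(x, y):
    """Closed-form MI for a jointly Gaussian pair, via the sample correlation:
    I(X;Y) = -0.5 * log(1 - rho^2)."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# Synthetic jointly Gaussian data with known correlation rho = 0.8.
rng = np.random.default_rng(0)
n, rho = 100_000, 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

est = gaussian_mi(x, y)
true_mi = -0.5 * np.log(1 - rho**2)  # about 0.511 nats
```

In the flow-based setting, `x` and `y` would be outputs of trained normalizing flows rather than raw data; this sketch only shows the closed-form step such estimators rely on.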

LSMI-Sinkhorn: Semi-supervised mutual information estimation with optimal transport

Y Liu, M Yamada, YHH Tsai, T Le… - Machine Learning and …, 2021 - Springer
Estimating mutual information is an important problem in statistics and machine learning. To
estimate the mutual information from data, a common practice is to prepare a set of paired …

Mutual Information Estimation using LSH Sampling.

R Spring, A Shrivastava - IJCAI, 2020 - ijcai.org
Learning representations in an unsupervised or self-supervised manner is a growing area of
research. Current approaches in representation learning seek to maximize the mutual …

Mutual information as a function of moments

W Alghamdi, FP Calmon - 2019 IEEE International Symposium …, 2019 - ieeexplore.ieee.org
We introduce a mutual information estimator based on the connection between estimation
theory and information theory. By combining a polynomial approximation of the minimum …

Investigation of alternative measures for mutual information

B Kuskonmaz, JS Gundersen, R Wisniewski - IFAC-PapersOnLine, 2022 - Elsevier
Mutual information I(X;Y) is a useful quantity in information theory that measures how
much information the random variable Y holds about the random variable X. One way to …
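For reference, the quantity all of these results target is, for discrete random variables,

```latex
I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \;=\; H(X) - H(X \mid Y),
```

which is nonnegative and equals zero exactly when X and Y are independent.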

Mutual information approximation via maximum likelihood estimation of density ratio

T Suzuki, M Sugiyama, T Tanaka - 2009 IEEE International …, 2009 - ieeexplore.ieee.org
We propose a new method of approximating mutual information based on maximum
likelihood estimation of a density ratio function. The proposed method, Maximum Likelihood …
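Underlying this approach is the identity I(X;Y) = E_{p(x,y)}[log p(x,y)/(p(x)p(y))], so MI estimation reduces to estimating a density ratio. The following is not the paper's MLMI procedure but a common stand-in for illustration: a logistic model trained to separate joint samples from shuffled (product-of-marginals) samples learns the log density ratio as its log-odds. The quadratic feature set and step size are illustrative choices for Gaussian toy data.

```python
import numpy as np

# Synthetic jointly Gaussian pair with known MI = -0.5*log(1-rho^2), about 0.511 nats.
rng = np.random.default_rng(1)
n, rho = 10_000, 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
y_shuf = rng.permutation(y)  # breaks the dependence: samples from p(x)p(y)

def feats(a, b):
    # Quadratic features are sufficient to represent the Gaussian log ratio.
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

F = np.vstack([feats(x, y), feats(x, y_shuf)])
labels = np.concatenate([np.ones(n), np.zeros(n)])

# Plain gradient ascent on the averaged logistic log-likelihood.
w = np.zeros(F.shape[1])
for _ in range(4000):
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w += 0.3 * F.T @ (labels - p) / len(labels)

# With balanced classes, the log-odds estimate the log density ratio;
# their average over the joint samples estimates the mutual information.
mi_hat = float(np.mean(feats(x, y) @ w))
```

The paper itself fits the ratio by maximum likelihood rather than via a classifier; this sketch only demonstrates why a good density-ratio estimate immediately yields an MI estimate.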

Bayesian and quasi-Bayesian estimators for mutual information from discrete data

E Archer, IM Park, JW Pillow - Entropy, 2013 - mdpi.com
Mutual information (MI) quantifies the statistical dependency between a pair of random
variables, and plays a central role in the analysis of engineering and biological systems …

Neural estimators for conditional mutual information using nearest neighbors sampling

S Molavipour, G Bassi… - IEEE Transactions on Signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set
of samples is a longstanding problem. A recent line of work in this area has leveraged the …
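For context, neighbor-based CMI estimators extend classical k-NN estimators of plain MI. A compact sketch of the Kraskov-Stögbauer-Grassberger (KSG) estimator (a reference point, not the paper's conditional method) on Gaussian toy data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, rho, k = 1000, 0.8, 3
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Digamma at positive integers: psi(m) = -gamma + H_{m-1} (harmonic numbers).
H = np.concatenate([[0.0], np.cumsum(1.0 / np.arange(1, n + 1))])
def psi(m):
    return -np.euler_gamma + H[np.asarray(m) - 1]

# Distance to the k-th nearest neighbor in the max-norm on (x, y).
dx = np.abs(x[:, None] - x[None, :])
dy = np.abs(y[:, None] - y[None, :])
dmax = np.maximum(dx, dy)
np.fill_diagonal(dmax, np.inf)  # exclude self from the neighbor search
eps = np.sort(dmax, axis=1)[:, k - 1]

# Count marginal neighbors strictly within eps (subtract 1 for self).
nx = (dx < eps[:, None]).sum(axis=1) - 1
ny = (dy < eps[:, None]).sum(axis=1) - 1

# KSG estimator: I ~ psi(k) + psi(n) - <psi(nx+1) + psi(ny+1)>.
mi_hat = float(psi(k) + psi(n) - np.mean(psi(nx + 1) + psi(ny + 1)))
# True MI for rho = 0.8 is -0.5*log(1 - rho^2), about 0.511 nats.
```

The nearest-neighbor sampling in the paper reuses this machinery but conditions the neighborhoods on a third variable Z.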

Ensemble estimation of mutual information

KR Moon, K Sricharan, AO Hero - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
We derive the mean squared error convergence rates of kernel density-based plug-in
estimators of mutual information measures between two multidimensional random variables …
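The base estimator such ensembles build on can be sketched directly: estimate p(x), p(y), and p(x,y) with Gaussian kernels and plug them into I = (1/n) Σ_i log(p(x_i,y_i)/(p(x_i)p(y_i))). A single-bandwidth, leave-one-out version on Gaussian toy data follows; the fixed bandwidth is an illustrative rule-of-thumb choice, not the paper's ensemble weighting, which combines several bandwidths to cancel bias.

```python
import numpy as np

rng = np.random.default_rng(3)
n, rho = 2000, 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def loo_kde(sq_dists, h, dim):
    """Leave-one-out Gaussian KDE evaluated at the sample points."""
    k = np.exp(-sq_dists / (2 * h**2)) / (2 * np.pi * h**2) ** (dim / 2)
    np.fill_diagonal(k, 0.0)  # leave-one-out: drop the self-kernel
    return k.sum(axis=1) / (len(k) - 1)

h = 0.25  # rule-of-thumb bandwidth for ~2000 unit-variance samples
dx2 = (x[:, None] - x[None, :]) ** 2
dy2 = (y[:, None] - y[None, :]) ** 2

px = loo_kde(dx2, h, dim=1)
py = loo_kde(dy2, h, dim=1)
pxy = loo_kde(dx2 + dy2, h, dim=2)  # joint KDE in 2D

# Plug-in estimate: average log density ratio over the samples.
mi_hat = float(np.mean(np.log(pxy / (px * py))))
# True MI for rho = 0.8 is -0.5*log(1 - rho^2), about 0.511 nats.
```

The convergence rates derived in the paper characterize exactly how the bias and variance of this kind of plug-in estimate scale with the bandwidth and sample size.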

Ensemble estimation of generalized mutual information with applications to genomics

KR Moon, K Sricharan, AO Hero - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Mutual information is a measure of the dependence between random variables that has
been used successfully in myriad applications in many fields. Generalized mutual …