Ensemble estimation of generalized mutual information with applications to genomics

KR Moon, K Sricharan, AO Hero - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Mutual information is a measure of the dependence between random variables that has
been used successfully in myriad applications in many fields. Generalized mutual …
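
For reference, the Shannon mutual information discussed throughout these entries, together with one standard f-divergence-style generalization, can be written as follows. The exact family of generalized measures treated in the paper may be parameterized differently; the form below is only the common textbook one.

```latex
% Shannon mutual information for joint density f_{XY} and marginals f_X, f_Y:
\[
  I(X;Y) = \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy .
\]
% One standard f-divergence-style generalization (an assumption about the
% exact family used in the paper): for a convex function g,
\[
  I_g(X;Y) = \iint g\!\left(\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\right)
             f_X(x)\,f_Y(y)\,dx\,dy ,
\]
% which recovers Shannon MI for g(t) = t \log t.
```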

Ensemble estimation of mutual information

KR Moon, K Sricharan, AO Hero - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
We derive the mean squared error convergence rates of kernel density-based plug-in
estimators of mutual information measures between two multidimensional random variables …
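
As a concrete illustration of the kernel density-based plug-in construction analysed in this entry, the sketch below substitutes Gaussian KDEs of the joint and marginal densities into the definition of MI and averages the log ratio over the samples. It is a generic plug-in estimator with scipy's default bandwidths, not the specific boundary-corrected construction whose convergence rates the paper derives.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_plugin_mi(x, y):
    """Resubstitution KDE plug-in estimate of I(X;Y) in nats.

    x, y: 1-D arrays of paired samples.  A generic plug-in construction,
    not the paper's exact estimator."""
    xy = np.vstack([x, y])                      # shape (2, n) for gaussian_kde
    f_xy = gaussian_kde(xy)
    f_x, f_y = gaussian_kde(x), gaussian_kde(y)
    # average log density ratio at the observed samples
    log_ratio = np.log(f_xy(xy)) - np.log(f_x(x)) - np.log(f_y(y))
    return float(np.mean(log_ratio))
```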

Jackknife approach to the estimation of mutual information

X Zeng, Y Xia, H Tong - Proceedings of the National …, 2018 - National Acad Sciences
Quantifying the dependence between two random variables is a fundamental issue in data
analysis, and thus many measures have been proposed. Recent studies have focused on …
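
The snippet does not spell out the construction, but a jackknife correction can be illustrated generically: wrap any base mutual information estimator (the KDE plug-in sketch above is used purely as a placeholder) and combine the full-sample estimate with the leave-one-out estimates. This is the textbook jackknife bias correction, not necessarily the specific estimator developed by Zeng, Xia, and Tong.

```python
import numpy as np

def jackknife_mi(x, y, base_estimator):
    """Jackknife bias-corrected MI estimate.

    base_estimator(x, y) -> float is any MI estimator, e.g. the
    kde_plugin_mi sketch above (a placeholder choice, not the paper's)."""
    n = len(x)
    full = base_estimator(x, y)
    # leave-one-out estimates
    loo = np.array([base_estimator(np.delete(x, i), np.delete(y, i))
                    for i in range(n)])
    # textbook jackknife bias correction
    return n * full - (n - 1) * loo.mean()
```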

Estimating mutual information by local Gaussian approximation

S Gao, GV Steeg, A Galstyan - arXiv preprint arXiv:1508.00536, 2015 - arxiv.org
Estimating mutual information (MI) from samples is a fundamental problem in statistics,
machine learning, and data analysis. Recently it was shown that a popular class of non …
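
The abstract is truncated, but the idea named in the title can be illustrated crudely for scalar X and Y: fit a bivariate Gaussian to the k nearest neighbours of each sample in the joint space and average the Gaussian mutual information implied by the local correlation, -0.5 log(1 - rho^2). This is only a rough sketch of the local-Gaussian idea (k and the neighbourhood rule are arbitrary choices here), not the estimator derived by Gao, Ver Steeg, and Galstyan.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_gaussian_mi(x, y, k=20):
    """Average pointwise Gaussian MI -0.5*log(1 - rho^2), where rho is the
    correlation within each sample's k-nearest-neighbour neighbourhood in
    the joint space.  Illustrative only; not the paper's estimator."""
    xy = np.column_stack([x, y])
    tree = cKDTree(xy)
    _, idx = tree.query(xy, k=k + 1)            # k neighbours plus the point itself
    vals = []
    for nbrs in idx:
        local = xy[nbrs]
        rho = np.corrcoef(local[:, 0], local[:, 1])[0, 1]
        if not np.isfinite(rho):                # degenerate neighbourhood
            rho = 0.0
        rho = np.clip(rho, -0.999, 0.999)
        vals.append(-0.5 * np.log(1.0 - rho ** 2))
    return float(np.mean(vals))
```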

A computationally efficient estimator for mutual information

D Evans - Proceedings of the Royal Society A …, 2008 - royalsocietypublishing.org
Mutual information quantifies the determinism that exists in a relationship between random
variables, and thus plays an important role in exploratory data analysis. We investigate a …

Exponential concentration for mutual information estimation with application to forests

H Liu, L Wasserman, J Lafferty - Advances in Neural …, 2012 - proceedings.neurips.cc
We prove a new exponential concentration inequality for a plug-in estimator of the Shannon
mutual information. Previous results on mutual information estimation only bounded …

Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models

T Sakai, M Sugiyama - IEICE TRANSACTIONS on Information and …, 2014 - search.ieice.org
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence
between random variables. The sample-based SMI approximator called least-squares …
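
The least-squares SMI approximator referenced here can be sketched as follows: model the density ratio p(x,y)/(p(x)p(y)) with multiplicative Gaussian kernels centred at the paired samples, fit the coefficients by ridge-regularized squared loss, and plug the fit into SMI. The kernel widths and regularization strength below are fixed placeholders (the paper selects them by cross-validation), so this is a minimal sketch rather than the paper's full procedure.

```python
import numpy as np

def gaussian_gram(a, b, sigma):
    """Gaussian kernel Gram matrix between rows of a and rows of b."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lsmi(x, y, sigma_x=1.0, sigma_y=1.0, lam=1e-3):
    """Least-squares SMI estimate with multiplicative Gaussian kernels
    centred at the paired samples (placeholder hyperparameters)."""
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    Kx = gaussian_gram(x, x, sigma_x)           # (n, n)
    Ky = gaussian_gram(y, y, sigma_y)           # (n, n)
    h = (Kx * Ky).mean(axis=0)                  # h_l = mean_i K(x_i,x_l) L(y_i,y_l)
    # H factorizes elementwise for multiplicative kernel models:
    # H_{l,l'} = (mean_i K(x_i,x_l)K(x_i,x_l')) * (mean_j L(y_j,y_l)L(y_j,y_l'))
    H = (Kx.T @ Kx / n) * (Ky.T @ Ky / n)
    theta = np.linalg.solve(H + lam * np.eye(n), h)
    return float(0.5 * h @ theta - 0.5)
```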

Analysis of KNN information estimators for smooth distributions

P Zhao, L Lai - IEEE Transactions on Information Theory, 2019 - ieeexplore.ieee.org
The KSG mutual information estimator, which is based on the distances of each sample to its k-th
nearest neighbor, is widely used to estimate mutual information between two continuous …
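
The KSG construction mentioned in the abstract can be written down directly: take the max-norm distance from each sample to its k-th nearest neighbour in the joint space, count how many marginal points fall strictly inside that radius, and combine the counts with digamma terms. A minimal sketch of KSG "algorithm 1" follows; the tie handling and the choice k=3 are simplifications.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG (algorithm 1) mutual information estimate in nats, max-norm metric."""
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])
    tree_xy = cKDTree(xy)
    # distance to the k-th neighbour in the joint space, excluding the
    # point itself (hence k + 1 and taking the last column)
    eps = tree_xy.query(xy, k=k + 1, p=np.inf)[0][:, -1]
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    # count marginal neighbours strictly within eps of each point
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    return float(digamma(k) + digamma(n)
                 - np.mean(digamma(nx + 1) + digamma(ny + 1)))
```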

Bayesian and quasi-Bayesian estimators for mutual information from discrete data

E Archer, IM Park, JW Pillow - Entropy, 2013 - mdpi.com
Mutual information (MI) quantifies the statistical dependency between a pair of random
variables, and plays a central role in the analysis of engineering and biological systems …
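
For discrete data, the naive baseline that Bayesian and quasi-Bayesian estimators aim to improve on is the plug-in estimate from a joint count table. The sketch below computes it, with an optional Dirichlet pseudocount as a crude stand-in for a Bayesian prior; it is not the estimator proposed by Archer, Park, and Pillow.

```python
import numpy as np

def discrete_mi_plugin(counts, alpha=0.0):
    """Plug-in MI (in nats) from a joint count table.

    counts: 2-D array of co-occurrence counts for (X, Y).
    alpha:  optional Dirichlet pseudocount added to every cell; a crude
            smoothing device, not the paper's Bayesian estimator."""
    p = counts.astype(float) + alpha
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p * np.log(p / (px * py))       # zero cells contribute nothing
    return float(np.nansum(terms))
```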