Critical values of a kernel density-based mutual information estimator

RJ May, GC Dandy, HR Maier… - The 2006 IEEE …, 2006 - ieeexplore.ieee.org
Recently, mutual information (MI) has become widely recognized as a statistical measure of
dependence that is suitable for applications where data are non-Gaussian, or where the …
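
For orientation, a kernel density plug-in MI estimator in its most basic form looks like the sketch below (SciPy's gaussian_kde on synthetic 1-D data, with invented variable names); it illustrates the family of estimators whose critical values the paper studies, not the paper's specific estimator or its critical values.

```python
# Minimal sketch of a KDE plug-in MI estimator (illustrative only; not the
# estimator or critical values from the paper above). Assumes 1-D samples.
import numpy as np
from scipy.stats import gaussian_kde

def kde_mutual_information(x, y):
    """Average log density ratio: mean of log p(x,y) - log p(x) - log p(y)."""
    xy = np.vstack([x, y])                         # 2 x n array of joint samples
    p_xy = gaussian_kde(xy)                        # joint density estimate
    p_x, p_y = gaussian_kde(x), gaussian_kde(y)    # marginal density estimates
    return np.mean(np.log(p_xy(xy)) - np.log(p_x(x)) - np.log(p_y(y)))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + 0.5 * rng.normal(size=500)                 # dependent pair, so MI is well above 0
print(kde_mutual_information(x, y))                # estimate in nats
```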

Jackknife approach to the estimation of mutual information

X Zeng, Y Xia, H Tong - Proceedings of the National …, 2018 - National Acad Sciences
Quantifying the dependence between two random variables is a fundamental issue in data
analysis, and thus many measures have been proposed. Recent studies have focused on …
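
The generic jackknife recipe combines leave-one-out re-estimates to reduce the bias of a base estimator. The sketch below applies it to a deliberately crude Gaussian-assumption MI estimator purely to show the mechanics; the paper's jackknife construction for kernel-based MI estimators is more involved.

```python
# Generic jackknife bias correction applied to a toy MI estimator; the base
# estimator and the synthetic data are invented for illustration.
import numpy as np

def gaussian_mi(x, y):
    """Base estimator: Gaussian-assumption MI, -0.5 * log(1 - rho^2)."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

def jackknife_mi(x, y):
    n = len(x)
    full = gaussian_mi(x, y)
    # Leave-one-out re-estimates.
    loo = np.array([gaussian_mi(np.delete(x, i), np.delete(y, i)) for i in range(n)])
    # Standard jackknife bias-corrected combination.
    return n * full - (n - 1) * loo.mean()

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.8 * x + rng.normal(size=200)
print(jackknife_mi(x, y))
```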

Finite sample based mutual information

K Rajab, F Kamalov - IEEE Access, 2021 - ieeexplore.ieee.org
Mutual information is a popular metric in machine learning. In the case of a discrete target
variable and a continuous feature variable, the mutual information can be calculated as a …
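
The mixed discrete/continuous setting rests on the standard entropy decomposition (generic notation, not quoted from the paper):

```latex
I(X;Y) \;=\; h(X) - h(X \mid Y) \;=\; h(X) - \sum_{y} P(Y=y)\, h(X \mid Y=y)
```

where h(·) denotes the differential entropy of the continuous feature X and the sum runs over the values of the discrete target Y; finite-sample estimators differ mainly in how they estimate these entropy terms.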

Neural estimators for conditional mutual information using nearest neighbors sampling

S Molavipour, G Bassi… - IEEE Transactions on Signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set
of samples is a longstanding problem. A recent line of work in this area has leveraged the …
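
For context only, the sketch below shows the basic neural MI estimation idea via the Donsker-Varadhan lower bound (a MINE-style setup in PyTorch, with an invented critic network and synthetic data); the paper's contribution concerns conditional MI and the nearest-neighbors sampling needed to approximate the conditional product distribution, which this sketch omits.

```python
# MINE-style neural MI estimation via the Donsker-Varadhan bound (generic
# illustration; the paper's k-NN sampling scheme for *conditional* MI is omitted).
import torch
import torch.nn as nn

class Critic(nn.Module):
    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1))

torch.manual_seed(0)
n = 2000
x = torch.randn(n, 1)
y = x + 0.5 * torch.randn(n, 1)                     # dependent pair

critic = Critic()
opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
for _ in range(500):
    perm = torch.randperm(n)
    joint = critic(x, y).squeeze()                  # critic on paired samples
    marg = critic(x, y[perm]).squeeze()             # critic on shuffled (independent-like) pairs
    # Donsker-Varadhan lower bound: E_p[f] - log E_q[exp(f)].
    mi_lb = joint.mean() - (torch.logsumexp(marg, 0) - torch.log(torch.tensor(float(n))))
    opt.zero_grad()
    (-mi_lb).backward()                             # maximise the bound
    opt.step()

print(mi_lb.item())                                 # rough MI estimate in nats
```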

Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models

T Sakai, M Sugiyama - IEICE Transactions on Information and …, 2014 - search.ieice.org
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence
between random variables. The sample-based SMI approximator called least-squares …
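
For reference, squared-loss mutual information is commonly defined as the Pearson divergence between the joint density and the product of its marginals (generic notation, not copied from the paper):

```latex
\mathrm{SMI}(X;Y) \;=\; \frac{1}{2}\iint p_x(x)\,p_y(y)
\left(\frac{p_{xy}(x,y)}{p_x(x)\,p_y(y)} - 1\right)^{2} dx\,dy
```

Least-squares approximators of SMI fit the density ratio p_{xy}/(p_x p_y) directly rather than the individual densities, which is where the choice of kernel model affects the computational cost.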

Estimating mutual information by local Gaussian approximation

S Gao, GV Steeg, A Galstyan - arXiv preprint arXiv:1508.00536, 2015 - arxiv.org
Estimating mutual information (MI) from samples is a fundamental problem in statistics,
machine learning, and data analysis. Recently it was shown that a popular class of non …
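
One way to picture the local Gaussian idea, heavily simplified: fit a Gaussian to the neighborhood of each sample and average the resulting closed-form pointwise quantities. The sketch below does this for two 1-D variables with invented parameter choices; it conveys the flavor of the approach, not the paper's estimator.

```python
# Rough sketch of a local-Gaussian-style MI estimate for two 1-D variables
# (illustrative simplification, not the estimator from the paper above).
import numpy as np
from scipy.spatial import cKDTree

def local_gaussian_mi(x, y, k=50):
    pts = np.column_stack([x, y])
    _, idx = cKDTree(pts).query(pts, k=k)           # k nearest neighbours of each point
    pmis = []
    for neighbours in idx:
        local = pts[neighbours]
        rho = np.corrcoef(local[:, 0], local[:, 1])[0, 1]
        rho = np.clip(rho, -0.999, 0.999)           # guard against degenerate local fits
        pmis.append(-0.5 * np.log(1.0 - rho ** 2))  # Gaussian MI of the local fit
    return float(np.mean(pmis))

rng = np.random.default_rng(2)
x = rng.normal(size=1000)
y = 0.7 * x + rng.normal(size=1000)
print(local_gaussian_mi(x, y))
```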

On estimating mutual information for feature selection

E Schaffernicht, R Kaltenhaeuser, SS Verma… - … Conference on Artificial …, 2010 - Springer
Mutual Information (MI) is a powerful concept from information theory used in many
application fields. For practical tasks, it is often necessary to estimate the Mutual Information …
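
As a concrete example of the feature-selection use case, the snippet below ranks features by an off-the-shelf MI estimate (scikit-learn's k-NN based mutual_info_classif on synthetic data); the paper compares several MI estimators for this task, and no particular one is singled out here.

```python
# MI-based feature ranking for selection, using a generic off-the-shelf estimator.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

X, y = make_classification(n_samples=1000, n_features=10, n_informative=3,
                           n_redundant=0, random_state=0)
scores = mutual_info_classif(X, y, random_state=0)
ranking = np.argsort(scores)[::-1]                  # feature indices, highest estimated MI first
print("top features:", ranking[:3])
print("their MI scores:", np.round(scores[ranking[:3]], 3))
```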

Smoothed noise contrastive mutual information neural estimation

X Wang, A Al-Bashabsheh, C Zhao, C Chan - Journal of the Franklin …, 2023 - Elsevier
Information Noise Contrastive Estimation (InfoNCE) is a popular neural estimator of
mutual information (MI). While InfoNCE has demonstrated impressive results in …
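
For context, the InfoNCE objective with a critic f and a batch of K paired samples is the familiar lower bound (generic notation, not taken from the paper):

```latex
I(X;Y) \;\ge\; \mathbb{E}\left[\frac{1}{K}\sum_{i=1}^{K}
\log \frac{e^{f(x_i,\,y_i)}}{\frac{1}{K}\sum_{j=1}^{K} e^{f(x_i,\,y_j)}}\right]
```

Because the log ratio inside the expectation cannot exceed log K, InfoNCE estimates saturate at log K, a widely noted limitation of this family of estimators.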

Mutual information estimation with random forests

M Koeman, T Heskes - … , ICONIP 2014, Kuching, Malaysia, November 3-6 …, 2014 - Springer
We present a new method for estimating mutual information based on random forest
classifiers. This method uses random permutation of one of the two variables to create data …
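
The permutation-plus-classifier trick can be sketched as follows (scikit-learn random forest on synthetic data; details such as cross-validation and the paper's exact conversion from classifier output to MI are omitted): label the observed pairs as one class and pairs with one variable permuted as the other, then read an MI estimate off the learned density ratio.

```python
# Sketch of MI estimation by training a random forest to separate joint samples
# from permuted (product-of-marginals-like) samples; illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 3000
x = rng.normal(size=n)
y = 0.9 * x + 0.5 * rng.normal(size=n)

joint = np.column_stack([x, y])                      # draws from p(x, y)
permuted = np.column_stack([x, rng.permutation(y)])  # approximate draws from p(x)p(y)
data = np.vstack([joint, permuted])
labels = np.concatenate([np.ones(n), np.zeros(n)])

# Train on every other pair, evaluate on held-out joint pairs to limit optimism.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(data[::2], labels[::2])
p = clf.predict_proba(joint[1::2])[:, 1]             # P(class "joint" | x, y)
p = np.clip(p, 1e-3, 1 - 1e-3)
# With balanced classes, the log-odds approximate log p(x,y) / (p(x)p(y)).
print(np.mean(np.log(p / (1 - p))))                  # MI estimate in nats
```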

Ensemble estimation of mutual information

KR Moon, K Sricharan, AO Hero - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
We derive the mean squared error convergence rates of kernel density-based plug-in
estimators of mutual information measures between two multidimensional random variables …
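
To illustrate the ensemble idea at a toy level: form kernel plug-in estimates at several bandwidths and combine them. The weighting in the paper is chosen to cancel lower-order bias terms; the uniform average below is only a placeholder for that combination step, with bandwidth values invented.

```python
# Toy version of ensemble MI estimation: KDE plug-in estimates at several
# bandwidths, combined here by a plain average (the paper uses optimized weights).
import numpy as np
from scipy.stats import gaussian_kde

def plugin_mi(x, y, bw):
    xy = np.vstack([x, y])
    p_xy = gaussian_kde(xy, bw_method=bw)
    p_x, p_y = gaussian_kde(x, bw_method=bw), gaussian_kde(y, bw_method=bw)
    return np.mean(np.log(p_xy(xy)) - np.log(p_x(x)) - np.log(p_y(y)))

rng = np.random.default_rng(4)
x = rng.normal(size=800)
y = 0.6 * x + rng.normal(size=800)

bandwidth_factors = [0.1, 0.2, 0.4, 0.8]           # passed to gaussian_kde as bw_method
estimates = [plugin_mi(x, y, bw) for bw in bandwidth_factors]
print(np.mean(estimates))                          # unweighted ensemble combination
```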