Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models

T Sakai, M Sugiyama - IEICE TRANSACTIONS on Information and …, 2014 - search.ieice.org
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence
between random variables. The sample-based SMI approximator called least-squares …

Estimating mutual information by local Gaussian approximation

S Gao, GV Steeg, A Galstyan - arXiv preprint arXiv:1508.00536, 2015 - arxiv.org
Estimating mutual information (MI) from samples is a fundamental problem in statistics,
machine learning, and data analysis. Recently it was shown that a popular class of non …

Machine learning with squared-loss mutual information

M Sugiyama - Entropy, 2012 - mdpi.com
Mutual information (MI) is useful for detecting statistical independence between random
variables, and it has been successfully applied to solving various machine learning …

Mutual Information Estimation using LSH Sampling

R Spring, A Shrivastava - IJCAI, 2020 - ijcai.org
Learning representations in an unsupervised or self-supervised manner is a growing area of
research. Current approaches in representation learning seek to maximize the mutual …

Jackknife approach to the estimation of mutual information

X Zeng, Y Xia, H Tong - Proceedings of the National …, 2018 - National Acad Sciences
Quantifying the dependence between two random variables is a fundamental issue in data
analysis, and thus many measures have been proposed. Recent studies have focused on …
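
The jackknife idea behind this entry, in its generic form: re-estimate on every leave-one-out sample and recombine to cancel the first-order bias of a base estimator. A minimal Python sketch, using a naive plug-in MI on discrete pairs as the base estimator (an assumption for illustration — the paper itself builds on kernel density estimates):

```python
import math
from collections import Counter

def plugin_mi(pairs):
    """Naive plug-in MI estimate (nats) from discrete (x, y) sample pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def jackknife_mi(pairs):
    """Jackknife bias correction: n * I_hat - (n - 1) * mean of the
    n leave-one-out estimates, cancelling the O(1/n) bias term."""
    n = len(pairs)
    full = plugin_mi(pairs)
    loo = sum(plugin_mi(pairs[:i] + pairs[i + 1:]) for i in range(n)) / n
    return n * full - (n - 1) * loo
```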

Finite sample based mutual information

K Rajab, F Kamalov - IEEE Access, 2021 - ieeexplore.ieee.org
Mutual information is a popular metric in machine learning. In the case of a discrete target
variable and a continuous feature variable, the mutual information can be calculated as a …
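
For the discrete-target, continuous-feature setting this entry describes, a crude baseline is to bin the feature and apply the plug-in estimate to the resulting contingency counts. A hedged sketch (equal-width binning and the function name `plugin_mi` are illustrative assumptions, not the paper's estimator):

```python
import math
from collections import Counter

def plugin_mi(xs, ys, n_bins=8):
    """Plug-in MI estimate (nats) between a continuous feature xs
    (equal-width binned) and a discrete target ys."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / n_bins or 1.0  # guard against constant xs
    bx = [min(int((x - lo) / width), n_bins - 1) for x in xs]
    n = len(xs)
    pxy = Counter(zip(bx, ys))
    px = Counter(bx)
    py = Counter(ys)
    # sum over occupied cells of p(x,y) * log( p(x,y) / (p(x) p(y)) )
    return sum((c / n) * math.log(c * n / (px[b] * py[y]))
               for (b, y), c in pxy.items())
```

On a perfectly dependent balanced sample (the target is the sign of the feature) this recovers H(Y) = log 2 nats.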

CCMI: Classifier based conditional mutual information estimation

S Mukherjee, H Asnani… - Uncertainty in artificial …, 2020 - proceedings.mlr.press
Conditional Mutual Information (CMI) is a measure of conditional dependence
between random variables X and Y, given another random variable Z. It can be used to …

Improving mutual information estimation with annealed and energy-based bounds

R Brekelmans, S Huang, M Ghassemi… - arXiv preprint arXiv …, 2023 - arxiv.org
Mutual information (MI) is a fundamental quantity in information theory and machine
learning. However, direct estimation of MI is intractable, even if the true joint probability …

Sliced mutual information: A scalable measure of statistical dependence

Z Goldfeld, K Greenewald - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Mutual information (MI) is a fundamental measure of statistical dependence, with a myriad of
applications to information theory, statistics, and machine learning. While it possesses many …
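
The sliced construction this entry introduces averages the MI between one-dimensional random projections of the two variables, so only scalar MI estimation is ever needed. A Monte Carlo sketch under stated assumptions: random Gaussian directions stand in for the uniform measure on the sphere, and a simple binning estimator stands in for the paper's 1-D MI estimator:

```python
import math
import random
from collections import Counter

def binned_mi_1d(u, v, n_bins=8):
    """Crude plug-in MI (nats) between two scalar samples via equal-width bins."""
    def binned(z):
        lo, hi = min(z), max(z)
        w = (hi - lo) / n_bins or 1.0
        return [min(int((t - lo) / w), n_bins - 1) for t in z]
    bu, bv = binned(u), binned(v)
    n = len(u)
    pj, pu, pv = Counter(zip(bu, bv)), Counter(bu), Counter(bv)
    return sum((c / n) * math.log(c * n / (pu[a] * pv[b]))
               for (a, b), c in pj.items())

def sliced_mi(X, Y, n_slices=64, seed=0):
    """Monte Carlo sliced MI: average 1-D MI over random linear
    projections of X and Y (rows are samples, entries are features)."""
    rng = random.Random(seed)
    d_x, d_y = len(X[0]), len(Y[0])
    def unit(d):
        v = [rng.gauss(0, 1) for _ in range(d)]
        s = math.sqrt(sum(t * t for t in v))
        return [t / s for t in v]
    total = 0.0
    for _ in range(n_slices):
        a, b = unit(d_x), unit(d_y)
        u = [sum(ai * xi for ai, xi in zip(a, x)) for x in X]
        v = [sum(bi * yi for bi, yi in zip(b, y)) for y in Y]
        total += binned_mi_1d(u, v)
    return total / n_slices
```

Each slice costs only a 1-D estimate, which is what makes the measure scale to high dimension.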

Critical values of a kernel density-based mutual information estimator

RJ May, GC Dandy, HR Maier… - The 2006 IEEE …, 2006 - ieeexplore.ieee.org
Recently, mutual information (MI) has become widely recognized as a statistical measure of
dependence that is suitable for applications where data are non-Gaussian, or where the …