Estimating mutual information (MI) from samples is a fundamental problem in statistics, machine learning, and data analysis. Recently it was shown that a popular class of non …
Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning …
Learning representations in an unsupervised or self-supervised manner is a growing area of research. Current approaches in representation learning seek to maximize the mutual …
X Zeng, Y Xia, H Tong - Proceedings of the National …, 2018 - National Acad Sciences
Quantifying the dependence between two random variables is a fundamental issue in data analysis, and thus many measures have been proposed. Recent studies have focused on …
Mutual information is a popular metric in machine learning. In the case of a discrete target variable and a continuous feature variable, the mutual information can be calculated as a …
S Mukherjee, H Asnani… - Uncertainty in artificial …, 2020 - proceedings.mlr.press
Conditional Mutual Information (CMI) is a measure of conditional dependence between random variables X and Y, given another random variable Z. It can be used to …
Mutual information (MI) is a fundamental quantity in information theory and machine learning. However, direct estimation of MI is intractable, even if the true joint probability …
Z Goldfeld, K Greenewald - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Mutual information (MI) is a fundamental measure of statistical dependence, with a myriad of applications to information theory, statistics, and machine learning. While it possesses many …
RJ May, GC Dandy, HR Maier… - The 2006 IEEE …, 2006 - ieeexplore.ieee.org
Recently, mutual information (MI) has become widely recognized as a statistical measure of dependence that is suitable for applications where data are non-Gaussian, or where the …