Estimating mutual information is an important problem in statistics and machine learning. To estimate the mutual information from data, a common practice is preparing a set of paired …
Learning representations in an unsupervised or self-supervised manner is a growing area of research. Current approaches in representation learning seek to maximize the mutual …
W Alghamdi, FP Calmon - 2019 IEEE International Symposium …, 2019 - ieeexplore.ieee.org
We introduce a mutual information estimator based on the connection between estimation theory and information theory. By combining a polynomial approximation of the minimum …
Mutual information I(X;Y) is a quantity in information theory that measures how much information the random variable Y holds about the random variable X. One way to …
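The definition in this snippet can be checked with a small discrete example. This is a minimal sketch assuming a made-up 2×2 joint table (the numbers are illustrative, not from any of the listed papers):

```python
import numpy as np

# Illustrative joint distribution of two binary variables X and Y
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)

# I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) ), in nats
mi = np.sum(p_xy * np.log(p_xy / (p_x * p_y)))
print(mi)  # about 0.193 nats
```

Because the off-diagonal mass is small, knowing Y says a lot about X, so I(X;Y) is well above zero; a uniform 2×2 table would give exactly zero.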
We propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. The proposed method, Maximum Likelihood …
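The density-ratio idea behind estimators like the one in this snippet can be illustrated with a classifier-based sketch. This is not the MLMI procedure itself: it swaps in the well-known logistic-regression density-ratio trick (train a classifier to separate joint samples from product-of-marginals samples; with balanced classes, its log-odds approximates log p(x,y)/(p(x)p(y)), whose average over joint samples is I(X;Y)). The Gaussian toy data and quadratic features are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 20000, 0.8                       # correlated Gaussian pair; true MI = -0.5*ln(1-rho^2) ~ 0.511 nats
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
y_shuf = rng.permutation(y)               # shuffling breaks the dependence -> product of marginals

def feats(a, b):
    # Quadratic features: for Gaussians the true log density ratio is quadratic,
    # so this model family can represent it exactly.
    return np.column_stack([np.ones_like(a), a * b, a**2, b**2])

X = np.vstack([feats(x, y), feats(x, y_shuf)])
t = np.concatenate([np.ones(n), np.zeros(n)])   # label 1 = joint, 0 = marginals

w = np.zeros(X.shape[1])
for _ in range(5000):                     # plain gradient descent on the logistic loss
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * (X.T @ (p - t)) / len(t)

# With equal class sizes, no prior-odds correction is needed:
# the average log-odds on joint samples estimates I(X;Y) in nats.
mi_hat = (feats(x, y) @ w).mean()
```

The estimate should land near the analytic value 0.511 nats; with misspecified features or unbalanced classes the log-odds would need correcting.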
Mutual information (MI) quantifies the statistical dependency between a pair of random variables, and plays a central role in the analysis of engineering and biological systems …
S Molavipour, G Bassi… - IEEE transactions on signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set of samples is a longstanding problem. A recent line of work in this area has leveraged the …
We derive the mean squared error convergence rates of kernel density-based plug-in estimators of mutual information measures between two multidimensional random variables …
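A kernel density "plug-in" estimator of the kind analyzed in this snippet can be sketched as follows. This is a hedged illustration, not the paper's estimator or its rate analysis: the Gaussian toy data and the rule-of-thumb bandwidth are assumptions:

```python
import numpy as np

def gauss_kde(samples, query, h):
    """Isotropic Gaussian KDE; samples and query are (n, d) arrays."""
    d = samples.shape[1]
    sq = ((query[:, None, :] - samples[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * h * h)).mean(axis=1) / (2 * np.pi * h * h) ** (d / 2)

rng = np.random.default_rng(1)
n, rho = 2000, 0.8                        # true MI = -0.5*ln(1-rho^2) ~ 0.511 nats
x = rng.standard_normal((n, 1))
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal((n, 1))
xy = np.hstack([x, y])

h = n ** (-1.0 / 6.0)                     # rough rule-of-thumb bandwidth (an assumption)

# Plug the density estimates into I(X;Y) = E[ log p(x,y) / (p(x) p(y)) ],
# averaging over the observed samples.
mi_hat = np.mean(np.log(gauss_kde(xy, xy, h)
                        / (gauss_kde(x, x, h) * gauss_kde(y, y, h))))
```

Note that evaluating each KDE at its own sample points keeps the self-term, which adds a small bias; leave-one-out evaluation and data-driven bandwidths are the usual refinements, and the convergence rates the snippet refers to quantify exactly how such bias and variance shrink with n.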
Mutual information is a measure of the dependence between random variables that has been used successfully in myriad applications in many fields. Generalized mutual …