Mutual Information (MI) is a widely used measure of dependency between two random variables in information theory, statistics, and machine learning. Recently, several MI …
We derive the mean squared error convergence rates of kernel density-based plug-in estimators of mutual information measures between two multidimensional random variables …
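A minimal sketch of a kernel density-based plug-in MI estimate of the kind discussed here, assuming a Gaussian KDE from scipy and simple resubstitution; the bandwidth choices, boundary handling, and bias corrections analyzed in such papers are not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_plugin_mi(x, y):
    # Plug-in estimate: fit KDEs for the joint and marginal densities and
    # average log(p_xy / (p_x * p_y)) over the observed samples (resubstitution).
    x = np.atleast_2d(x)          # gaussian_kde expects shape (dim, n_samples)
    y = np.atleast_2d(y)
    xy = np.vstack([x, y])
    p_xy = gaussian_kde(xy)(xy)   # joint density evaluated at each sample
    p_x = gaussian_kde(x)(x)      # marginal density of X
    p_y = gaussian_kde(y)(y)      # marginal density of Y
    return float(np.mean(np.log(p_xy / (p_x * p_y))))

# Correlated Gaussians: true MI is -0.5 * log(1 - rho**2), about 0.51 nats for rho = 0.8.
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=2000)
print(kde_plugin_mi(z[:, 0], z[:, 1]))
```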
We propose a direct estimation method for Rényi and f-divergence measures based on a new graph-theoretical interpretation. Suppose that we are given two sample sets X and Y …
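The specific graph construction is not reproduced here; as a generic illustration of graph-based divergence estimation without explicit density estimation, the sketch below uses k-nearest-neighbor label counts in the pooled sample to estimate the local density ratio and plugs it into the KL divergence. The function name, the value of k, and the Laplace smoothing are illustrative assumptions, not this paper's construction.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_ratio_kl(x, y, k=10):
    # Estimate KL(f || g) from x ~ f and y ~ g: for each y_i, the share of X-points
    # among its k nearest neighbors in the pooled sample gives a local estimate of
    # the ratio f/g, which is plugged into KL(f || g) = E_g[(f/g) log(f/g)].
    x = np.asarray(x).reshape(len(x), -1)
    y = np.asarray(y).reshape(len(y), -1)
    m, n = len(x), len(y)
    pooled = np.vstack([x, y])
    from_x = np.concatenate([np.ones(m, bool), np.zeros(n, bool)])
    # k + 1 neighbors of each y_i in the pooled set; column 0 is y_i itself.
    idx = cKDTree(pooled).query(y, k=k + 1)[1][:, 1:]
    cx = from_x[idx].sum(axis=1)                   # neighbors drawn from X
    cy = k - cx                                    # neighbors drawn from Y
    ratio = (cx + 1.0) / (cy + 1.0) * (n / m)      # Laplace-smoothed local f/g at y_i
    return float(np.mean(ratio * np.log(ratio)))
```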
OC Mesner, CR Shalizi - IEEE Transactions on Information …, 2020 - ieeexplore.ieee.org
Fields like public health, public policy, and social science often want to quantify the degree of dependence between variables whose relationships take on unknown functional forms …
P Zhao, L Lai - IEEE Transactions on Information Theory, 2019 - ieeexplore.ieee.org
The KSG mutual information estimator, which is based on the distance from each sample to its k-th nearest neighbor, is widely used to estimate mutual information between two continuous …
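For reference, a compact sketch of the KSG estimator (the "algorithm 1" variant of Kraskov et al.) using max-norm k-NN distances and digamma corrections; the tie handling and the default k here are simplifications.

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def ksg_mi(x, y, k=3):
    # KSG estimator: the distance to the k-th nearest neighbor in the joint
    # (max-norm) space sets a radius for each sample; count strictly closer
    # neighbors in each marginal space and combine the counts through digamma
    # terms. The result is in nats.
    x = np.asarray(x).reshape(len(x), -1)
    y = np.asarray(y).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # k + 1 because the query point itself comes back at distance zero.
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    radius = np.nextafter(eps, 0)   # shrink slightly so the count is strict (<, not <=)
    nx = cKDTree(x).query_ball_point(x, radius, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, radius, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```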
In the context of supervised learning, meta-learning uses features, metadata, and other information to learn about the difficulty, behavior, or composition of the problem. Using this …
This paper proposes a geometric estimator of dependency between a pair of multivariate random variables. The proposed estimator is based on a randomly permuted …
M Noshad, A Hero - International Conference on Artificial …, 2018 - proceedings.mlr.press
We propose a scalable divergence estimation method based on hashing. Consider two continuous random variables $X$ and $Y$ whose densities have bounded support. We …
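The paper's hash-based estimator is not reproduced here; the sketch below only illustrates the underlying idea of mapping samples to discrete buckets and comparing bucket statistics, using a simple grid hash and a plug-in KL estimate. The cell width and the frequency floor are arbitrary illustrative choices.

```python
import numpy as np

def grid_hash_kl(x, y, cell=0.25):
    # Map each sample to an integer grid cell (a simple spatial hash), then form a
    # plug-in KL(f || g) estimate from the empirical cell frequencies of each set.
    def cells(z):
        z = np.asarray(z).reshape(len(z), -1)
        return [tuple(c) for c in np.floor(z / cell).astype(int)]
    cx, cy = cells(x), cells(y)
    support = set(cx) | set(cy)
    px = dict.fromkeys(support, 1e-6)   # small floor keeps the log finite
    py = dict.fromkeys(support, 1e-6)
    for c in cx:
        px[c] += 1.0 / len(cx)
    for c in cy:
        py[c] += 1.0 / len(cy)
    return float(sum(px[c] * np.log(px[c] / py[c]) for c in support))
```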
The Henze-Penrose divergence is a non-parametric divergence measure that can be used to estimate a bound on the Bayes error in a binary classification problem. In this paper, we …
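A compact sketch of the standard Friedman-Rafsky minimal spanning tree construction commonly used to estimate the Henze-Penrose divergence, assuming Euclidean edge weights and dense pairwise distances (fine for small samples); whether this matches the exact variant studied in the paper is not guaranteed.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def hp_divergence(x, y):
    # Friedman-Rafsky construction: build a Euclidean MST over the pooled samples
    # and count edges that join an X-point to a Y-point. Few cross edges means the
    # samples are well separated and the estimated divergence approaches 1.
    x = np.asarray(x).reshape(len(x), -1)
    y = np.asarray(y).reshape(len(y), -1)
    m, n = len(x), len(y)
    pooled = np.vstack([x, y])
    from_x = np.concatenate([np.ones(m, bool), np.zeros(n, bool)])
    mst = minimum_spanning_tree(cdist(pooled, pooled)).tocoo()
    cross = int(np.sum(from_x[mst.row] != from_x[mst.col]))
    return 1.0 - cross * (m + n) / (2.0 * m * n)
```

When the two samples come from the same distribution, the cross-edge count concentrates near 2mn/(m + n) and the estimate is close to zero; with fully separated samples only a single cross edge remains and the estimate approaches one.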