Demystifying Fixed k-Nearest Neighbor Information Estimators

W Gao, S Oh, P Viswanath - IEEE Transactions on Information …, 2018 - ieeexplore.ieee.org
Estimating mutual information from independent identically distributed samples drawn from
an unknown joint density function is a basic statistical problem of broad interest with …
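
The fixed k-NN mutual information estimators analyzed in this line of work include the well-known Kraskov-Stögbauer-Grassberger (KSG) estimator. A minimal NumPy/SciPy sketch of the classic KSG formula I ≈ ψ(k) + ψ(N) − ⟨ψ(n_x+1) + ψ(n_y+1)⟩ (my own brute-force illustration, not the paper's code; `ksg_mi` is a hypothetical name):

```python
import numpy as np
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG-style k-NN mutual information estimate from paired samples
    x, y of shape (n, d).  Brute-force O(n^2) distances, for clarity."""
    n = len(x)
    x = x.reshape(n, -1)
    y = y.reshape(n, -1)
    # Chebyshev (max-norm) distance matrices in each marginal space
    dx = np.abs(x[:, None, :] - x[None, :, :]).max(axis=2)
    dy = np.abs(y[:, None, :] - y[None, :, :]).max(axis=2)
    dz = np.maximum(dx, dy)          # joint-space Chebyshev distance
    np.fill_diagonal(dz, np.inf)     # exclude self from neighbor search
    eps = np.sort(dz, axis=1)[:, k - 1]          # distance to k-th neighbor
    nx = (dx < eps[:, None]).sum(axis=1) - 1     # marginal counts, minus self
    ny = (dy < eps[:, None]).sum(axis=1) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

On correlated Gaussians the estimate should track the closed-form value −½·log(1−ρ²); the fixed-k setting (k not growing with n) is exactly the regime the paper studies.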

Scalable mutual information estimation using dependence graphs

M Noshad, Y Zeng, AO Hero - ICASSP 2019-2019 IEEE …, 2019 - ieeexplore.ieee.org
The mutual information (MI) is an often-used measure of dependency between two random
variables, utilized in information theory, statistics, and machine learning. Recently several MI …

Convergence of smoothed empirical measures with applications to entropy estimation

Z Goldfeld, K Greenewald, J Niles-Weed… - IEEE Transactions …, 2020 - ieeexplore.ieee.org
This paper studies convergence of empirical measures smoothed by a Gaussian kernel.
Specifically, consider approximating P ∗ 𝒩_σ, for 𝒩_σ ≜ 𝒩(0, σ²I_d), by P̂_n ∗ 𝒩_σ under …
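
The smoothed object here, the empirical measure convolved with a Gaussian, is just an n-component Gaussian mixture, so its differential entropy can be estimated by plain Monte Carlo: sample from the mixture and average −log density. A sketch under that reading (my own illustration, not the paper's method; `smoothed_entropy_mc` is a hypothetical name):

```python
import numpy as np

def smoothed_entropy_mc(x, sigma, n_mc=20000, seed=0):
    """Monte Carlo estimate of the differential entropy of the
    Gaussian-smoothed empirical measure P_hat_n * N(0, sigma^2 I_d),
    i.e. the equal-weight mixture of N(x_i, sigma^2 I_d)."""
    rng = np.random.default_rng(seed)
    n, d = x.shape
    # draw n_mc points from the mixture: pick a center, add Gaussian noise
    centers = x[rng.integers(n, size=n_mc)]
    z = centers + sigma * rng.normal(size=(n_mc, d))
    # log mixture density at each Monte Carlo point (log-sum-exp for stability)
    sq = ((z[:, None, :] - x[None, :, :]) ** 2).sum(axis=2)
    log_comp = -sq / (2 * sigma**2) - 0.5 * d * np.log(2 * np.pi * sigma**2)
    log_dens = np.logaddexp.reduce(log_comp, axis=1) - np.log(n)
    return -log_dens.mean()
```

For x drawn from 𝒩(0,1) and σ = 1, the smoothed population measure is 𝒩(0, 2), whose entropy ½·log(2πe·2) ≈ 1.77 the estimate approaches as n grows.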

Ensemble estimation of mutual information

KR Moon, K Sricharan, AO Hero - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
We derive the mean squared error convergence rates of kernel density-based plug-in
estimators of mutual information measures between two multidimensional random variables …

Direct estimation of information divergence using nearest neighbor ratios

M Noshad, KR Moon, SY Sekeh… - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
We propose a direct estimation method for Rényi and f-divergence measures based on a
new graph theoretical interpretation. Suppose that we are given two sample sets X and Y …
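
The paper's own estimator is graph-theoretic; a classical nearest-neighbor-ratio construction in the same spirit is the Wang-Kulkarni-Verdú k-NN estimator of KL divergence, which compares each X-point's k-NN distance within X to its k-NN distance to Y. A brute-force sketch for illustration (not the paper's estimator; `knn_kl` is a hypothetical name):

```python
import numpy as np

def knn_kl(x, y, k=1):
    """Wang-Kulkarni-Verdu style k-NN estimate of D(p || q) from
    x ~ p of shape (n, d) and y ~ q of shape (m, d)."""
    n, d = x.shape
    m = y.shape[0]
    dxx = np.linalg.norm(x[:, None] - x[None, :], axis=2)
    np.fill_diagonal(dxx, np.inf)            # exclude self
    rho = np.sort(dxx, axis=1)[:, k - 1]     # k-NN distance within x
    dxy = np.linalg.norm(x[:, None] - y[None, :], axis=2)
    nu = np.sort(dxy, axis=1)[:, k - 1]      # k-NN distance from x to y
    # ratio nu/rho tracks the density ratio p/q at each x_i
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

For 𝒩(1,1) versus 𝒩(0,1) the closed-form divergence is ½, which the estimate should approximate; identical distributions should give a value near zero.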

Bias correction with jackknife, bootstrap, and Taylor series

J Jiao, Y Han - IEEE Transactions on Information Theory, 2020 - ieeexplore.ieee.org
We analyze bias correction methods using jackknife, bootstrap, and Taylor series. We focus
on the binomial model, and consider the problem of bias correction for estimating f (p) …
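
The first-order jackknife referenced here is simple to state: replace an estimate T(X) by n·T(X) − (n−1) times the average of the leave-one-out values, which removes the O(1/n) bias term. A sketch for the binomial setting with the illustrative choice f(p) = p² (my own example, not taken from the abstract):

```python
import numpy as np

def jackknife(estimator, sample):
    """First-order jackknife bias correction:
    n * T(sample) - (n - 1) * mean of leave-one-out estimates."""
    n = len(sample)
    full = estimator(sample)
    loo = np.array([estimator(np.delete(sample, i)) for i in range(n)])
    return n * full - (n - 1) * loo.mean()

# plug-in estimate of f(p) = p^2 from Bernoulli(p) draws
plug_in = lambda s: s.mean() ** 2
```

For this quadratic f the correction is exact: on a sample with m ones out of n, the jackknifed value works out to m(m−1)/(n(n−1)), the unbiased U-statistic estimate of p², whereas the plug-in (m/n)² carries bias p(1−p)/n.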

Learning to bound the multi-class Bayes error

SY Sekeh, B Oselio, AO Hero - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org
In the context of supervised learning, meta learning uses features, metadata and other
information to learn about the difficulty, behavior, or composition of the problem. Using this …

Ensemble estimation of information divergence

KR Moon, K Sricharan, K Greenewald, AO Hero III - Entropy, 2018 - mdpi.com
Recent work has focused on the problem of nonparametric estimation of information
divergence functionals between two continuous random variables. Many existing …

Nonparanormal information estimation

S Singh, B Póczos - International Conference on Machine …, 2017 - proceedings.mlr.press
We study the problem of using iid samples from an unknown multivariate probability
distribution p to estimate the mutual information of p. This problem has recently received …

Learning to benchmark: Determining best achievable misclassification error from training data

M Noshad, L Xu, A Hero - arXiv preprint arXiv:1909.07192, 2019 - arxiv.org
We address the problem of learning to benchmark the best achievable classifier
performance. In this problem the objective is to establish statistically consistent estimates of …