Estimating mutual information for discrete-continuous mixtures

W Gao, S Kannan, S Oh… - Advances in neural …, 2017 - proceedings.neurips.cc
Estimation of mutual information from observed samples is a basic primitive in machine
learning, useful in several learning tasks including correlation mining, information …

Minimax estimation of functionals of discrete distributions

J Jiao, K Venkat, Y Han… - IEEE Transactions on …, 2015 - ieeexplore.ieee.org
We propose a general methodology for the construction and analysis of essentially minimax
estimators for a wide class of functionals of finite dimensional parameters, and elaborate on …

Minimax rates of entropy estimation on large alphabets via best polynomial approximation

Y Wu, P Yang - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
Consider the problem of estimating the Shannon entropy of a distribution over k elements
from n independent samples. We show that the minimax mean-square error is within the …
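For context, the baseline that such minimax analyses improve on is the naive plug-in (maximum-likelihood) estimator, which substitutes empirical frequencies for the unknown distribution. A minimal sketch (the function name `plugin_entropy` is my own, not from the paper):

```python
from collections import Counter
from math import log

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in nats.

    Replaces the unknown distribution with empirical frequencies.
    Known to be biased downward when the alphabet size k is
    comparable to the sample size n, which motivates the
    polynomial-approximation estimators studied in this line of work.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * log(c / n) for c in counts.values())

# A balanced two-symbol sample gives exactly log 2 nats.
est = plugin_entropy(["H"] * 500 + ["T"] * 500)
```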

Inhomogeneous hypergraph clustering with applications

P Li, O Milenkovic - Advances in neural information …, 2017 - proceedings.neurips.cc
Hypergraph partitioning is an important problem in machine learning, computer vision and
network analytics. A widely used method for hypergraph partitioning relies on minimizing a …

Evidential reasoning for preprocessing uncertain categorical data for trustworthy decisions: An application on healthcare and finance

S Sachan, F Almaghrabi, JB Yang, DL Xu - Expert Systems with …, 2021 - Elsevier
The uncertainty attributed by discrepant data in AI-enabled decisions is a critical challenge
in highly regulated domains such as health care and finance. Ambiguity and incompleteness …

Multi-round incentive mechanism for cold start-enabled mobile crowdsensing

Y Lin, Z Cai, X Wang, F Hao, L Wang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Mobile CrowdSensing (MCS) has emerged as a novel paradigm for performing large-scale
sensing tasks. Many incentive mechanisms have been proposed to encourage user …

Empirical estimation of information measures: A literature guide

S Verdú - Entropy, 2019 - mdpi.com
We give a brief survey of the literature on the empirical estimation of entropy, differential
entropy, relative entropy, mutual information and related information measures. While those …

Multivariate trace estimation in constant quantum depth

Y Quek, E Kaur, MM Wilde - Quantum, 2024 - quantum-journal.org
There is a folkloric belief that a depth-$\Theta(m)$ quantum circuit is needed to estimate the
trace of the product of $m$ density matrices (i.e., a multivariate trace), a subroutine crucial to …

Estimation of KL divergence: Optimal minimax rate

Y Bu, S Zou, Y Liang… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
The problem of estimating the Kullback-Leibler divergence D(P‖Q) between two unknown
distributions P and Q is studied, under the assumption that the alphabet size k of the …
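A crude plug-in estimate of D(P‖Q) from two samples illustrates why the problem is delicate: empirical zeros on the Q side make the naive estimate infinite, so some smoothing is unavoidable. A sketch under those assumptions (the add-one smoothing and the name `plugin_kl` are illustrative choices, not the paper's method):

```python
from collections import Counter
from math import log

def plugin_kl(samples_p, samples_q, alphabet):
    """Naive plug-in estimate of D(P||Q) in nats over a known finite alphabet.

    Applies add-one (Laplace) smoothing on the Q side so that symbols
    unseen in samples_q do not drive the estimate to infinity; this is
    a crude fix, far from the minimax-optimal estimators analyzed in
    the large-alphabet literature.
    """
    n_p, n_q = len(samples_p), len(samples_q)
    k = len(alphabet)
    cp, cq = Counter(samples_p), Counter(samples_q)
    est = 0.0
    for x in alphabet:
        p = cp[x] / n_p
        q = (cq[x] + 1) / (n_q + k)  # Laplace-smoothed Q probability
        if p > 0:
            est += p * log(p / q)
    return est
```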

Estimating Rényi entropy of discrete distributions

J Acharya, A Orlitsky, AT Suresh… - IEEE Transactions on …, 2016 - ieeexplore.ieee.org
It was shown recently that estimating the Shannon entropy H(p) of a discrete k-symbol
distribution p requires Θ(k/log k) samples, a number that grows near-linearly in the support …
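For orders α ≠ 1, the Rényi entropy is H_α(p) = log(Σ_i p_i^α)/(1−α), and the plug-in version again just substitutes empirical frequencies. A minimal sketch (the name `plugin_renyi` is my own):

```python
from collections import Counter
from math import log

def plugin_renyi(samples, alpha):
    """Plug-in Rényi entropy H_alpha in nats, for alpha != 1.

    H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), evaluated at the
    empirical distribution. As alpha -> 1 this recovers Shannon entropy;
    the sample complexity of estimating it varies sharply with alpha.
    """
    if alpha == 1:
        raise ValueError("alpha = 1 is the Shannon case; use an entropy estimator")
    n = len(samples)
    counts = Counter(samples)
    power_sum = sum((c / n) ** alpha for c in counts.values())
    return log(power_sum) / (1.0 - alpha)

# Uniform over 4 symbols: every Renyi entropy equals log 4.
est = plugin_renyi(["a", "b", "c", "d"] * 10, alpha=2)
```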