Authors
Jiantao Jiao, Kartik Venkat, Yanjun Han, Tsachy Weissman
Publication date
2017/10
Journal
IEEE Transactions on Information Theory
Volume
63
Issue
10
Pages
6774–6798
Publisher
IEEE
Description
We consider the problem of estimating functionals of discrete distributions, and focus on a tight (up to universal multiplicative constants for each specific functional) nonasymptotic analysis of the worst case squared error risk of widely used estimators. We apply concentration inequalities to analyze the random fluctuation of these estimators around their expectations, and the theory of approximation using positive linear operators to analyze the deviation of their expectations from the true functional, namely their bias. We explicitly characterize the worst case squared error risk incurred by the maximum likelihood estimator (MLE) in estimating the Shannon entropy $H(P) = \sum_{i=1}^{S} -p_i \ln p_i$, and the power sum $F_\alpha(P) = \sum_{i=1}^{S} p_i^\alpha$, $\alpha > 0$, up to universal multiplicative constants for each fixed functional, for any alphabet size $S \le \infty$ and sample size $n$ for which the risk may vanish. As a corollary, for Shannon entropy …
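The MLE analyzed in the abstract is the standard "plug-in" estimator: compute the empirical distribution from the sample and evaluate the functional at it. A minimal sketch (the function names here are illustrative, not from the paper):

```python
import math
from collections import Counter

def mle_entropy(samples):
    """Plug-in (MLE) estimate of Shannon entropy H(P) = -sum_i p_i ln p_i.

    Empirical frequencies c/n stand in for the true probabilities p_i.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def mle_power_sum(samples, alpha):
    """Plug-in (MLE) estimate of the power sum F_alpha(P) = sum_i p_i^alpha."""
    if alpha <= 0:
        raise ValueError("alpha must be positive")
    n = len(samples)
    counts = Counter(samples)
    return sum((c / n) ** alpha for c in counts.values())

# For a sample that is exactly uniform over two symbols, the plug-in
# entropy estimate equals ln 2 and the power sum with alpha = 2 equals 0.5.
data = ["a", "b", "a", "b"]
print(mle_entropy(data))        # ln 2 ≈ 0.6931
print(mle_power_sum(data, 2))   # 0.5
```

The paper's contribution is a nonasymptotic bias/variance analysis of exactly this estimator, showing how its worst case risk scales jointly in the alphabet size S and sample size n.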
Total citations
[Citations-per-year chart, 2014–2024; per-year counts not recoverable from extraction]
Scholar articles
J Jiao, K Venkat, Y Han, T Weissman - IEEE Transactions on Information Theory, 2017
Y Han, J Jiao, T Weissman - 2015 IEEE International Symposium on Information …, 2015
J Jiao, K Venkat, Y Han, T Weissman - 2015 IEEE international symposium on information …, 2015