Common information, noise stability, and their extensions

L Yu, VYF Tan - Foundations and Trends® in …, 2022 - nowpublishers.com
Common information is ubiquitous in information theory and related areas such as
theoretical computer science and discrete probability. However, because there are multiple …
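The multiple notions at issue here presumably include at least the two classical ones. For a pair $(X, Y)$, the Gács–Körner common information and the Wyner common information are, respectively,

$C_{\mathrm{GK}}(X;Y) = \sup\{H(W) : H(W \mid X) = H(W \mid Y) = 0\}$ and $C_{\mathrm{W}}(X;Y) = \min_{W :\, X - W - Y} I(X, Y; W)$,

and in general $C_{\mathrm{GK}}(X;Y) \le I(X;Y) \le C_{\mathrm{W}}(X;Y)$.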

On universal features for high-dimensional learning and inference

SL Huang, A Makur, GW Wornell, L Zheng - arXiv preprint arXiv …, 2019 - arxiv.org
We consider the problem of identifying universal low-dimensional features from high-
dimensional data for inference tasks in settings involving learning. For such problems, we …
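As a point of reference for the kind of object this line of work produces: the features are closely tied to the Hirschfeld–Gebelein–Rényi (HGR) maximal correlation

$\rho(X;Y) = \sup_{f,\, g} \mathbb{E}[f(X)\, g(Y)]$, the supremum over $f, g$ with $\mathbb{E}[f(X)] = \mathbb{E}[g(Y)] = 0$ and $\mathbb{E}[f^2(X)] = \mathbb{E}[g^2(Y)] = 1$;

the optimizing $f$ and $g$, along with the further singular functions of the associated conditional-expectation operator, serve as low-dimensional feature maps.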

Comparison of Contraction Coefficients for f-Divergences

A Makur, L Zheng - Problems of Information Transmission, 2020 - Springer
Contraction coefficients are distribution-dependent constants that are used to sharpen
standard data processing inequalities for f-divergences (or relative f-entropies) and produce …
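Concretely, the contraction coefficient of a channel $W$ for an $f$-divergence $D_f$ is

$\eta_f(W) = \sup_{P \ne Q} \dfrac{D_f(PW \,\|\, QW)}{D_f(P \,\|\, Q)} \le 1$,

so the standard data processing inequality $D_f(PW \| QW) \le D_f(P \| Q)$ sharpens to $D_f(PW \| QW) \le \eta_f(W)\, D_f(P \| Q)$.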

On non-interactive simulation of joint distributions

S Kamath, V Anantharam - IEEE Transactions on Information …, 2016 - ieeexplore.ieee.org
We consider the following non-interactive simulation problem: Alice and Bob observe
sequences $X^n$ and $Y^n$, respectively, where $\{(X_i, Y_i)\}_{i=1}^n$ are drawn independent …
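A classical necessary condition in this problem comes from Witsenhausen's results on maximal correlation: $\rho(X^n; Y^n) = \rho(X; Y)$ for i.i.d. pairs, and $\rho$ cannot increase under local processing, so any non-interactively simulated pair $(U, V) = (f(X^n), g(Y^n))$ must satisfy

$\rho(U; V) \le \rho(X; Y)$.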

Information contraction and decomposition

A Makur - 2019 - dspace.mit.edu
These inequalities can be tightened to produce "strong" data processing inequalities
(SDPIs), which are obtained by introducing appropriate channel-dependent or source …
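In mutual information form, a channel-dependent SDPI for $P_{Y|X}$ reads: for every $U$ forming the Markov chain $U - X - Y$,

$I(U; Y) \le \eta(P_{Y|X})\, I(U; X)$,

with contraction coefficient $\eta(P_{Y|X}) < 1$ for suitably noisy channels; the source-dependent variant additionally fixes the input distribution $P_X$.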

Which Boolean functions maximize mutual information on noisy inputs?

TA Courtade, GR Kumar - IEEE Transactions on Information …, 2014 - ieeexplore.ieee.org
We pose a simply stated conjecture regarding the maximum mutual information a Boolean
function can reveal about noisy inputs. Specifically, let $X^n$ be independent identically …
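For context, the conjecture can be stated as follows: let $X^n$ be i.i.d. Bernoulli(1/2) and let $Y^n$ be the output of $X^n$ through a memoryless binary symmetric channel with crossover probability $\alpha$; then for every Boolean function $f : \{0,1\}^n \to \{0,1\}$,

$I(f(X^n); Y^n) \le 1 - H(\alpha)$,

where $H$ is the binary entropy function. The bound is attained by dictator functions such as $f(x^n) = x_1$, and the conjecture is that no Boolean function does better.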

On hypercontractivity and the mutual information between Boolean functions

V Anantharam, AA Gohari, S Kamath… - 2013 51st Annual …, 2013 - ieeexplore.ieee.org
Hypercontractivity has had many successful applications in mathematics, physics, and
theoretical computer science. In this work we use recently established properties of the …
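One standard formulation of the notion being applied: a pair $(X, Y)$ is $(p, q)$-hypercontractive for $p \ge q \ge 1$ if

$\|\mathbb{E}[g(Y) \mid X]\|_p \le \|g(Y)\|_q$ for all real-valued $g$,

and the region of such exponent pairs (the hypercontractivity ribbon) is what drives the resulting bounds on $I(f(X^n); g(Y^n))$ for Boolean $f$ and $g$.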

Bounds on inference

FP Calmon, M Varia, M Médard… - 2013 51st Annual …, 2013 - ieeexplore.ieee.org
Lower bounds for the average probability of error of estimating a hidden variable X given an
observation of a correlated random variable Y, and Fano's inequality in particular, play a …
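For reference, the inequality in question: if $\hat{X}(Y)$ is any estimator of $X$ over a finite alphabet $\mathcal{X}$ and $P_e = \Pr[\hat{X}(Y) \ne X]$, Fano's inequality states

$H(X \mid Y) \le H_b(P_e) + P_e \log(|\mathcal{X}| - 1)$,

where $H_b$ is the binary entropy; rearranging gives the familiar lower bound $P_e \ge (H(X \mid Y) - 1) / \log |\mathcal{X}|$ (logs base 2).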

Remarks on the most informative function conjecture at fixed mean

G Kindler, R O'Donnell, D Witmer - arXiv preprint arXiv:1506.03167, 2015 - arxiv.org
In 2013, Courtade and Kumar posed the following problem: Let $\boldsymbol{x} \sim \{\pm 1\}^n$ be
uniformly random, and form $\boldsymbol{y} \sim \{\pm 1\}^n$ by negating each …

A study of local approximations in information theory

A Makur - 2015 - dspace.mit.edu
The intractability of many information-theoretic problems arises from the meaningful but
nonlinear definition of Kullback-Leibler (KL) divergence between two probability …
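The local approximation that restores tractability is the standard second-order expansion of KL divergence: for distributions $Q$ near $P$ (in nats),

$D(P \,\|\, Q) = \frac{1}{2} \chi^2(P \,\|\, Q) + \text{higher-order terms}, \qquad \chi^2(P \,\|\, Q) = \sum_x \frac{(P(x) - Q(x))^2}{Q(x)},$

which replaces the nonlinear divergence by a weighted squared Euclidean norm and opens such problems to linear-algebraic (SVD-based) analysis.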