PacGAN: The power of two samples in generative adversarial networks

Z Lin, A Khetan, G Fanti, S Oh - Advances in neural …, 2018 - proceedings.neurips.cc
Generative adversarial networks (GANs) are a technique for learning generative models of
complex data distributions from samples. Despite remarkable advances in generating …
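
The core idea of PacGAN is "packing": the discriminator classifies m samples jointly rather than one at a time, which makes mode collapse easier to detect. Below is a minimal, hypothetical PyTorch sketch of a packed discriminator; the class name, layer sizes, and pack size are illustrative choices, not taken from the paper.

import torch
import torch.nn as nn

class PackedDiscriminator(nn.Module):
    """Hypothetical sketch: scores a pack of `pack` samples jointly
    (the "packing" idea behind PacGAN)."""
    def __init__(self, data_dim: int, pack: int = 2, hidden: int = 128):
        super().__init__()
        self.pack = pack
        self.net = nn.Sequential(
            nn.Linear(data_dim * pack, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),  # one real/fake logit per pack
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch * pack, data_dim); group consecutive samples into packs
        packed = x.view(x.size(0) // self.pack, -1)
        return self.net(packed)

Both real and generated minibatches are grouped into packs of m samples before scoring, so a generator that covers too few modes produces packs that are easy to flag as fake.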

The composition theorem for differential privacy

P Kairouz, S Oh, P Viswanath - International conference on …, 2015 - proceedings.mlr.press
Interactive querying of a database degrades the privacy level. In this paper we answer the
fundamental question of characterizing the level of privacy degradation as a function of the …
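
For orientation, the standard bounds that this paper sharpens can be stated as follows (these are the textbook basic and advanced composition bounds, not the paper's exact optimal characterization): composing k mechanisms that are each $(\varepsilon,\delta)$-differentially private yields

$$(k\varepsilon,\; k\delta)\text{-DP (basic composition)}, \qquad \Big(\varepsilon\sqrt{2k\ln(1/\delta')} + k\varepsilon(e^{\varepsilon}-1),\; k\delta+\delta'\Big)\text{-DP (advanced composition)}$$

for any $\delta' > 0$. The paper characterizes the exact, optimal privacy region under composition, which is tighter than both of these bounds.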

[BOOK][B] Network information theory

A El Gamal, YH Kim - 2011 - books.google.com
This comprehensive treatment of network information theory and its applications provides
the first unified coverage of both classical and recent results. With an approach that …

Biometric security from an information-theoretical perspective

T Ignatenko, FMJ Willems - Foundations and Trends® in …, 2012 - nowpublishers.com
In this review, biometric systems are studied from an information theoretical point of view. In
the first part biometric authentication systems are studied. The objective of these systems is …
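
For context, a central quantity in this literature (stated here as a standard illustration from the biometric secrecy literature, not quoted from the monograph) is the secret-key capacity of a biometric source: with enrollment observation X and authentication observation Y,

$$C_{\mathrm{SK}} = I(X;Y),$$

the mutual information between the two observations, which bounds the rate at which secret keys can be generated while keeping the key (nearly) independent of the publicly stored helper data.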

Estimation in Gaussian noise: Properties of the minimum mean-square error

D Guo, Y Wu, SS Shitz, S Verdú - IEEE Transactions on …, 2011 - ieeexplore.ieee.org
Consider the minimum mean-square error (MMSE) of estimating an arbitrary random
variable from its observation contaminated by Gaussian noise. The MMSE can be regarded …
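
As a concrete instance of the quantity studied (standard definitions; the Gaussian example is an illustration, not taken from the paper): for X observed through $Y = \sqrt{\mathrm{snr}}\,X + N$ with $N \sim \mathcal{N}(0,1)$ independent of X,

$$\mathrm{mmse}(X,\mathrm{snr}) = \mathbb{E}\big[(X - \mathbb{E}[X \mid \sqrt{\mathrm{snr}}\,X + N])^2\big],$$

and for Gaussian $X \sim \mathcal{N}(0,\sigma^2)$ this equals $\sigma^2/(1+\sigma^2\,\mathrm{snr})$. The same expression upper-bounds $\mathrm{mmse}(X,\mathrm{snr})$ for any X of variance $\sigma^2$, since the optimal estimator can do no worse than the best linear one.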

Single-user beamforming in large-scale MISO systems with per-antenna constant-envelope constraints: The doughnut channel

SK Mohammed, EG Larsson - IEEE Transactions on Wireless …, 2012 - ieeexplore.ieee.org
Large antenna arrays at the transmitter (TX) have recently been shown to achieve
remarkable intra-cell interference suppression at low complexity. However, building large …
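
A small numerical illustration of why the feasible region is doughnut-shaped (a sketch under the standard constant-envelope model, not code from the paper): with per-antenna signals $x_i = \sqrt{P/N}\,e^{j\theta_i}$, the noiseless received value $u = \sum_i h_i x_i$ is a sum of complex numbers of fixed magnitudes $r_i = |h_i|\sqrt{P/N}$, so $|u|$ can take any value in $[\max(0,\, 2\max_i r_i - \sum_i r_i),\ \sum_i r_i]$, i.e. an annulus (doughnut) in the complex plane.

import numpy as np

rng = np.random.default_rng(0)
N, P = 16, 1.0                                      # TX antennas, total power budget
h = rng.normal(size=N) + 1j * rng.normal(size=N)    # i.i.d. complex channel gains
r = np.abs(h) * np.sqrt(P / N)                      # per-antenna contribution magnitudes

# Sample random phase vectors and record |u| = |sum_i h_i x_i|
theta = rng.uniform(0, 2 * np.pi, size=(100_000, N))
u = (r * np.exp(1j * (np.angle(h) + theta))).sum(axis=1)

outer = r.sum()                                     # outer radius of the doughnut
inner = max(0.0, 2 * r.max() - r.sum())             # inner radius (often 0 when no gain dominates)
print(f"sampled |u| range: [{np.abs(u).min():.3f}, {np.abs(u).max():.3f}]")
print(f"doughnut radii:    [{inner:.3f}, {outer:.3f}]")

Random phases only visit the interior, so the sampled range lies strictly inside the theoretical annulus; choosing the phases deliberately reaches any point of it.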

Information theoretic proofs of entropy power inequalities

O Rioul - IEEE Transactions on Information Theory, 2010 - ieeexplore.ieee.org
While most useful information theoretic inequalities can be deduced from the basic
properties of entropy or mutual information, up to now Shannon's entropy power inequality …
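
For reference, the inequality in question is Shannon's entropy power inequality: with the entropy power $N(X) := \frac{1}{2\pi e}\, e^{2h(X)}$,

$$N(X+Y) \ \ge\ N(X) + N(Y)$$

for independent random variables X and Y, with equality if and only if X and Y are Gaussian with proportional covariances.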

Generalized entropy power inequalities and monotonicity properties of information

M Madiman, A Barron - IEEE Transactions on Information …, 2007 - ieeexplore.ieee.org
New families of Fisher information and entropy power inequalities for sums of independent
random variables are presented. These inequalities relate the information in the sum of n …
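
One monotonicity statement this line of work yields (stated here for the i.i.d. case as an illustration; the paper's inequalities cover general independent summands and subset collections): for i.i.d. $X_1, X_2, \ldots$ with finite variance,

$$h\!\left(\frac{X_1+\cdots+X_n}{\sqrt{n}}\right) \ \ge\ h\!\left(\frac{X_1+\cdots+X_{n-1}}{\sqrt{n-1}}\right),$$

i.e. the entropy of the standardized sum is non-decreasing in n, giving an information-theoretic reading of the central limit theorem.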

Mismatched estimation and relative entropy

S Verdú - IEEE Transactions on Information Theory, 2010 - ieeexplore.ieee.org
A random variable with distribution P is observed in Gaussian noise and is estimated by a
mismatched minimum mean-square estimator that assumes that the distribution is Q, instead …
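
The relation at the heart of this topic, stated with adapted notation as an illustration: if $X \sim P$ is observed as $Y_\gamma = \sqrt{\gamma}\,X + N$ with standard Gaussian noise, $\mathrm{mmse}_P(\gamma)$ is the matched minimum mean-square error and $\mathrm{mse}_Q(\gamma)$ the mean-square error of the estimator that assumes $X \sim Q$, then

$$D(P\|Q) \ =\ \frac{1}{2}\int_0^{\infty} \big[\mathrm{mse}_Q(\gamma) - \mathrm{mmse}_P(\gamma)\big]\, d\gamma,$$

so the relative entropy is half the integrated excess mean-square error caused by the mismatch.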

The entropy power inequality for quantum systems

R König, G Smith - IEEE Transactions on Information Theory, 2014 - ieeexplore.ieee.org
When two independent analog signals, X and Y, are added together giving Z = X + Y, the
entropy of Z, H(Z), is not a simple function of the entropies H(X) and H(Y), but rather …
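
The classical statement being generalized here is the entropy power inequality $e^{2H(Z)} \ge e^{2H(X)} + e^{2H(Y)}$ for independent X, Y with $Z = X + Y$. In the quantum setting, addition is replaced by combining two n-mode bosonic states on a beamsplitter; for the balanced (50:50) case the analogous bound takes the form (recalled here as an illustration rather than a quotation)

$$e^{S(\rho_Z)/n} \ \ge\ \tfrac{1}{2}\, e^{S(\rho_X)/n} + \tfrac{1}{2}\, e^{S(\rho_Y)/n},$$

with S the von Neumann entropy and $\rho_Z$ the beamsplitter output for independent inputs $\rho_X$ and $\rho_Y$.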