Annealed importance sampling (AIS) and its sequential Monte Carlo (SMC) extensions are state-of-the-art methods for estimating normalizing constants of probability …
G Zhang, J Qian, J Chen… - Advances in Neural …, 2021 - proceedings.neurips.cc
In the context of lossy compression, Blau & Michaeli (2019) adopt a mathematical notion of perceptual quality and define the information rate-distortion-perception function …
Annealed importance sampling (AIS) and related algorithms are highly effective tools for marginal likelihood estimation, but are not fully differentiable due to the use of Metropolis …
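The AIS estimator referred to in the snippet above can be sketched in a few lines. This is an illustrative toy example, not code from the paper: the initial distribution `p0` (a standard normal) and the unnormalized target `f` (a Gaussian bump with known normalizer) are chosen here purely so the estimate can be checked against a closed-form answer. The Metropolis accept/reject branch is the discrete, non-differentiable step the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D example: initial distribution p0 = N(0, 1) (normalized),
# unnormalized target f(x) = exp(-(x - 3)^2 / (2 * 0.5^2)),
# whose true normalizer is Z = 0.5 * sqrt(2*pi) ~ 1.2533.
def log_p0(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_f(x):
    return -0.5 * ((x - 3.0) / 0.5) ** 2

def log_gamma(x, beta):
    # geometric annealing path between p0 and f
    return (1.0 - beta) * log_p0(x) + beta * log_f(x)

betas = np.linspace(0.0, 1.0, 200)
n = 4000
x = rng.standard_normal(n)        # exact samples from p0
logw = np.zeros(n)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # incremental importance weight for raising the temperature
    logw += (b - b_prev) * (log_f(x) - log_p0(x))
    # one Metropolis-Hastings step targeting gamma_b; the accept/reject
    # branch below is the non-differentiable operation the snippet refers to
    prop = x + 0.5 * rng.standard_normal(n)
    accept = np.log(rng.random(n)) < log_gamma(prop, b) - log_gamma(x, b)
    x = np.where(accept, prop, x)

Z_hat = float(np.exp(logw).mean())  # AIS estimate of Z (true value ~1.2533)
```

The average of the importance weights is an unbiased estimator of the ratio of normalizers; since `p0` is normalized here, `Z_hat` estimates the normalizer of `f` directly.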
Variational autoencoders (VAEs) are powerful tools for learning latent representations of data and are used in a wide range of applications. In practice, VAEs usually require multiple training …
Y Yang, S Mandt - arXiv preprint arXiv:2111.12166, 2021 - arxiv.org
The rate-distortion (RD) function, a key quantity in information theory, characterizes the fundamental limit of how much a data source can be compressed subject to a fidelity …
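A classical way to trace out the RD curve numerically is the Blahut–Arimoto iteration. The sketch below is a generic textbook version, not the method of the paper above: a hypothetical Bernoulli(0.5) source with Hamming distortion, where the algorithm's output can be checked against the known closed form R(D) = ln 2 − H_b(D) in nats.

```python
import numpy as np

# Hypothetical setup: Bernoulli(0.5) source, Hamming distortion, and the
# Blahut-Arimoto fixed-point iteration for one slope parameter s < 0.
p = np.array([0.5, 0.5])                     # source distribution p(x)
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])                   # Hamming distortion d(x, y)
s = -2.0                                     # slope of the R(D) curve
q = np.array([0.5, 0.5])                     # output marginal q(y), init uniform

for _ in range(200):
    # conditional Q(y|x) proportional to q(y) * exp(s * d(x, y))
    Q = q[None, :] * np.exp(s * d)
    Q /= Q.sum(axis=1, keepdims=True)
    q = p @ Q                                # re-estimate the output marginal

D = float(np.sum(p[:, None] * Q * d))        # achieved distortion
R = float(np.sum(p[:, None] * Q * np.log(Q / q[None, :])))  # rate in nats
```

Each choice of the slope `s` yields one (D, R) point on the curve; sweeping `s` traces out the whole RD function.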
A fundamental question in designing lossy data compression schemes is how well one can do in comparison with the rate-distortion function, which describes the known theoretical …
Mutual information (MI) is a fundamental quantity in information theory and machine learning. However, direct estimation of MI is intractable, even if the true joint probability …
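For context on what MI estimators approximate, the quantity is only directly computable in small discrete cases. The sketch below uses a hypothetical 2x2 joint distribution (illustrative numbers, not from the paper) and evaluates the definition exactly; it is precisely this sum that becomes intractable for high-dimensional or continuous variables.

```python
import numpy as np

# Hypothetical small discrete joint p(x, y)
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1, keepdims=True)   # marginal over y
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal over x

# I(X; Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) ), in nats
mi = float(np.sum(p_xy * np.log(p_xy / (p_x * p_y))))
print(round(mi, 4))   # 0.1927
```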
The recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a family of variational inference objectives, which both …
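The thermodynamic integration identity underlying the TVO can be illustrated on a toy Gaussian pair where every intermediate expectation is available in closed form. This is an assumed example, not the paper's construction: with `p0 = N(0, 1)` and unnormalized target `f(x) = exp(-(x - 3)^2 / (2 * 0.5^2))`, the identity log Z = ∫₀¹ E_β[log f(x) − log p0(x)] dβ can be evaluated by quadrature and checked against the known log Z = log(0.5·√(2π)).

```python
import numpy as np

betas = np.linspace(0.0, 1.0, 1001)
tau = 1.0 + 3.0 * betas            # precision of the intermediate Gaussian
mu = 12.0 * betas / tau            # mean of the intermediate Gaussian

# E_beta[log f(x) - log p0(x)] in closed form for this Gaussian pair:
# E[x^2] = mu^2 + 1/tau and E[(x-3)^2] = (mu-3)^2 + 1/tau under gamma_beta
U = (-2.0 * ((mu - 3.0) ** 2 + 1.0 / tau)
     + 0.5 * (mu ** 2 + 1.0 / tau)
     + 0.5 * np.log(2.0 * np.pi))

# thermodynamic integration: log Z = integral of U over beta (trapezoid rule)
log_Z = float(np.sum((U[1:] + U[:-1]) * np.diff(betas)) / 2.0)
# true value: log(0.5 * sqrt(2*pi)) ~ 0.2258
```

The TVO replaces this fine quadrature with a coarse partition of [0, 1] whose Riemann sums give upper and lower variational bounds on the marginal likelihood.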
Devising deep latent variable models for multi-modal data has been a long-standing theme in machine learning research. Multi-modal Variational Autoencoders (VAEs) have been a …