Denoising diffusion probabilistic models

J Ho, A Jain, P Abbeel - Advances in neural information …, 2020 - proceedings.neurips.cc
We present high quality image synthesis results using diffusion probabilistic models, a class
of latent variable models inspired by considerations from nonequilibrium thermodynamics …

Annealed flow transport Monte Carlo

M Arbel, A Matthews, A Doucet - … Conference on Machine …, 2021 - proceedings.mlr.press
Annealed Importance Sampling (AIS) and its Sequential Monte Carlo (SMC)
extensions are state-of-the-art methods for estimating normalizing constants of probability …

Universal rate-distortion-perception representations for lossy compression

G Zhang, J Qian, J Chen… - Advances in Neural …, 2021 - proceedings.neurips.cc
In the context of lossy compression, Blau & Michaeli (2019) adopt a mathematical notion of
perceptual quality and define the information rate-distortion-perception function …

Differentiable annealed importance sampling and the perils of gradient noise

G Zhang, K Hsu, J Li, C Finn… - Advances in Neural …, 2021 - proceedings.neurips.cc
Annealed importance sampling (AIS) and related algorithms are highly effective tools for
marginal likelihood estimation, but are not fully differentiable due to the use of Metropolis …

Multi-rate VAE: Train once, get the full rate-distortion curve

J Bae, MR Zhang, M Ruan, E Wang… - arXiv preprint arXiv …, 2022 - arxiv.org
Variational autoencoders (VAEs) are powerful tools for learning latent representations of
data used in a wide range of applications. In practice, VAEs usually require multiple training …

Towards empirical sandwich bounds on the rate-distortion function

Y Yang, S Mandt - arXiv preprint arXiv:2111.12166, 2021 - arxiv.org
The rate-distortion (RD) function, a key quantity in information theory, characterizes the
fundamental limit of how much a data source can be compressed subject to a fidelity …

Neural estimation of the rate-distortion function with applications to operational source coding

E Lei, H Hassani, SS Bidokhti - IEEE Journal on Selected Areas …, 2022 - ieeexplore.ieee.org
A fundamental question in designing lossy data compression schemes is how well one can
do in comparison with the rate-distortion function, which describes the known theoretical …

Improving mutual information estimation with annealed and energy-based bounds

R Brekelmans, S Huang, M Ghassemi… - arXiv preprint arXiv …, 2023 - arxiv.org
Mutual information (MI) is a fundamental quantity in information theory and machine
learning. However, direct estimation of MI is intractable, even if the true joint probability …

All in the exponential family: Bregman duality in thermodynamic variational inference

R Brekelmans, V Masrani, F Wood, GV Steeg… - arXiv preprint arXiv …, 2020 - arxiv.org
The recently proposed Thermodynamic Variational Objective (TVO) leverages
thermodynamic integration to provide a family of variational inference objectives, which both …

Learning multi-modal generative models with permutation-invariant encoders and tighter variational objectives

M Hirt, D Campolo, V Leong… - Transactions on Machine …, 2024 - openreview.net
Devising deep latent variable models for multi-modal data has been a long-standing theme
in machine learning research. Multi-modal Variational Autoencoders (VAEs) have been a …