Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions

S Chen, S Chewi, J Li, Y Li, A Salim… - arXiv preprint arXiv …, 2022 - arxiv.org
We provide theoretical convergence guarantees for score-based generative models (SGMs)
such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of …

Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data

M Chen, K Huang, T Zhao… - … Conference on Machine …, 2023 - proceedings.mlr.press
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …

Stochastic interpolants: A unifying framework for flows and diffusions

MS Albergo, NM Boffi, E Vanden-Eijnden - arXiv preprint arXiv …, 2023 - arxiv.org
A class of generative models that unifies flow-based and diffusion-based methods is
introduced. These models extend the framework proposed in Albergo & Vanden-Eijnden …

The probability flow ODE is provably fast

S Chen, S Chewi, H Lee, Y Li, J Lu… - Advances in Neural …, 2024 - proceedings.neurips.cc
We provide the first polynomial-time convergence guarantees for the probability flow ODE
implementation (together with a corrector step) of score-based generative modeling. Our …

Restoration-degradation beyond linear diffusions: A non-asymptotic analysis for DDIM-type samplers

S Chen, G Daras, A Dimakis - International Conference on …, 2023 - proceedings.mlr.press
We develop a framework for non-asymptotic analysis of deterministic samplers used for
diffusion generative modeling. Several recent works have analyzed stochastic samplers …

Diffusion models are minimax optimal distribution estimators

K Oko, S Akiyama, T Suzuki - International Conference on …, 2023 - proceedings.mlr.press
While efficient distribution learning is no doubt behind the groundbreaking success of
diffusion modeling, its theoretical guarantees are quite limited. In this paper, we provide the …

Linear convergence bounds for diffusion models via stochastic localization

J Benton, V De Bortoli, A Doucet… - arXiv preprint arXiv …, 2023 - arxiv.org
Diffusion models are a powerful method for generating approximate samples from high-
dimensional data distributions. Several recent results have provided polynomial bounds on …

Learning mixtures of Gaussians using the DDPM objective

K Shah, S Chen, A Klivans - Advances in Neural …, 2023 - proceedings.neurips.cc
Recent works have shown that diffusion models can learn essentially any distribution
provided one can perform score estimation. Yet it remains poorly understood under what …

Reward-directed conditional diffusion: Provable distribution estimation and reward improvement

H Yuan, K Huang, C Ni, M Chen… - Advances in Neural …, 2024 - proceedings.neurips.cc
We explore the methodology and theory of reward-directed generation via conditional
diffusion models. Directed generation aims to generate samples with desired properties as …

Image denoising: The deep learning revolution and beyond—a survey paper

M Elad, B Kawar, G Vaksman - SIAM Journal on Imaging Sciences, 2023 - SIAM
Image denoising—removal of additive white Gaussian noise from an image—is one of the
oldest and most studied problems in image processing. Extensive work over several …