We provide the first polynomial-time convergence guarantees for the probability flow ODE implementation (together with a corrector step) of score-based generative modeling. Our …
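For a sense of what the probability flow ODE computes, here is a minimal sketch for a 1-D Gaussian data distribution, where the score of the Ornstein–Uhlenbeck forward process is available in closed form; the Gaussian target, horizon, and step size are illustrative assumptions, not taken from the paper.

```python
# Probability flow ODE sketch: OU forward process dx = -x dt + sqrt(2) dB,
# integrated backward with Euler steps using the exact Gaussian score.
# All parameters below are assumptions chosen for illustration.
import numpy as np

s0sq, T, h = 4.0, 3.0, 0.01          # data variance, time horizon, step size

def var(t):
    # Variance of the forward OU process at time t, for data ~ N(0, s0sq).
    return s0sq * np.exp(-2 * t) + (1 - np.exp(-2 * t))

rng = np.random.default_rng(2)
x = rng.standard_normal(2000) * np.sqrt(var(T))   # start from (approx.) p_T

# Reverse-time ODE: dx/dt = -x - grad log p_t(x) = -x + x / var(t).
for k in range(int(round(T / h))):
    t = T - k * h
    x = x - h * (-x + x / var(t))    # Euler step backward in time

print(x.std())  # close to sqrt(s0sq) = 2 for this linear example
```

Because the flow is deterministic, every sample follows the same ODE; only the initial draw from (approximately) p_T is random.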
Diffusion models are a powerful method for generating approximate samples from high-dimensional data distributions. Several recent results have provided polynomial bounds on …
S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
Abstract We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
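The ULA iteration referenced in this abstract is the standard one, $x_{k+1} = x_k - h\,\nabla f(x_k) + \sqrt{2h}\,\xi_k$ with $\xi_k \sim N(0, I)$. A minimal sketch, assuming a standard Gaussian target (so $f(x) = \|x\|^2/2$ and $\nabla f(x) = x$) purely for illustration:

```python
# Minimal sketch of the Unadjusted Langevin Algorithm (ULA) for sampling
# from nu = exp(-f) on R^n. The quadratic f below (standard Gaussian target)
# is an illustrative assumption, not part of the cited abstract.
import numpy as np

def ula(grad_f, x0, step, n_iters, rng):
    """Run one ULA chain: x <- x - step * grad_f(x) + sqrt(2*step) * noise."""
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Example: f(x) = ||x||^2 / 2, so grad_f(x) = x and nu is the standard Gaussian.
rng = np.random.default_rng(0)
samples = np.array([ula(lambda x: x, np.zeros(2), 0.01, 2000, rng) for _ in range(500)])
print(samples.mean(axis=0))  # close to [0, 0] for this target
```

Note that ULA is "unadjusted" in the sense that there is no Metropolis accept/reject step, which is exactly why it incurs the discretization bias the paper's convergence analysis quantifies.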
Variational inference (VI) seeks to approximate a target distribution $\pi $ by an element of a tractable family of distributions. Of key interest in statistics and machine learning is Gaussian …
It is a fundamental problem to understand the complexity of high-accuracy sampling from a strongly log-concave density π on ℝ^d. Indeed, in practice, high-accuracy samplers such as …
S Mei, Y Wu - arXiv preprint arXiv:2309.11420, 2023 - arxiv.org
We investigate the approximation efficiency of score functions by deep neural networks in diffusion-based generative modeling. While existing approximation theories utilize the …
B Klartag, J Lehec - Geometric and functional analysis, 2022 - Springer
Bourgain’s slicing problem and KLS isoperimetry up to polylog | Geometric and Functional Analysis …
S Zhang, S Chewi, M Li… - The Thirty Sixth …, 2023 - proceedings.mlr.press
Abstract Underdamped Langevin Monte Carlo (ULMC) is an algorithm used to sample from unnormalized densities by leveraging the momentum of a particle moving in a potential well …
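The underdamped dynamics behind ULMC couple a position with a momentum variable: $dx = v\,dt$, $dv = -(\gamma v + \nabla f(x))\,dt + \sqrt{2\gamma}\,dB_t$. A minimal Euler–Maruyama sketch; the friction $\gamma$, step size, and Gaussian target are illustrative assumptions (practical ULMC implementations typically use a more accurate exponential-integrator discretization):

```python
# Minimal Euler-Maruyama sketch of Underdamped Langevin Monte Carlo (ULMC).
# The friction gamma, step size, and quadratic f (standard Gaussian target)
# are assumptions chosen for illustration.
import numpy as np

def ulmc(grad_f, x0, v0, step, gamma, n_iters, rng):
    """Discretize dx = v dt, dv = -(gamma*v + grad_f(x)) dt + sqrt(2*gamma) dB_t."""
    x, v = np.array(x0, dtype=float), np.array(v0, dtype=float)
    for _ in range(n_iters):
        x = x + step * v
        v = (v - step * (gamma * v + grad_f(x))
             + np.sqrt(2.0 * gamma * step) * rng.standard_normal(v.shape))
    return x

# Example: f(x) = ||x||^2 / 2, so grad_f(x) = x.
rng = np.random.default_rng(1)
xs = np.array([ulmc(lambda x: x, np.zeros(2), np.zeros(2), 0.01, 2.0, 3000, rng)
               for _ in range(400)])
print(xs.mean(axis=0))
```

The momentum term is what lets the chain traverse the potential well faster than overdamped ULA, which is the source of ULMC's improved dimension dependence.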
S Chewi, J Niles-Weed, P Rigollet - arXiv preprint arXiv:2407.18163, 2024 - arxiv.org
Statistical Optimal Transport …