Score-based generative models (SGMs) have demonstrated remarkable synthesis quality. SGMs rely on a diffusion process that gradually perturbs the data towards a tractable …
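One standard way to make the perturbation concrete is an Ornstein-Uhlenbeck (variance-preserving) forward process whose marginal approaches a standard Gaussian prior as time grows; that specific choice is an assumption of the sketch below, not something taken from the truncated snippet above, and the function name ou_perturb is illustrative.

import numpy as np

def ou_perturb(x0, t, rng=None):
    """Perturb data x0 with the OU forward process dX_t = -X_t dt + sqrt(2) dB_t.

    Conditionally on x0, X_t ~ N(exp(-t) * x0, (1 - exp(-2t)) * I), so as t grows
    the sample approaches the tractable standard Gaussian prior N(0, I).
    """
    rng = rng or np.random.default_rng(0)
    mean = np.exp(-t) * x0
    std = np.sqrt(1.0 - np.exp(-2.0 * t))
    return mean + std * rng.standard_normal(x0.shape)

# Example: perturb a small batch of 2-D data points at increasing times.
x0 = np.array([[2.0, -1.0], [0.5, 3.0]])
for t in (0.1, 1.0, 5.0):
    print(t, ou_perturb(x0, t))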
Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference …
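For reference, VI recasts posterior inference as optimization of the evidence lower bound (ELBO); this is standard background rather than anything specific to the snippet above:

$$\log p(x) \;=\; \log \int p(x, z)\, dz \;\ge\; \mathbb{E}_{q(z)}\!\left[\log p(x, z) - \log q(z)\right],$$

with equality when $q(z) = p(z \mid x)$, so maximizing the right-hand side over a tractable family of distributions $q$ yields an approximation to the posterior.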
S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
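The ULA iteration is short enough to state as code; the sketch below is mine, with an illustrative step size and a standard Gaussian example target (neither is taken from the paper):

import numpy as np

def ula(grad_f, x0, h, n_steps, rng=None):
    """Unadjusted Langevin Algorithm: x_{k+1} = x_k - h * grad_f(x_k) + sqrt(2h) * xi_k."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        x = x - h * grad_f(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative target: nu = e^{-f} with f(x) = ||x||^2 / 2 (standard Gaussian).
samples = ula(grad_f=lambda x: x, x0=np.zeros(2), h=0.01, n_steps=5000)
print(samples.mean(axis=0), samples.var(axis=0))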
For the task of sampling from a density $\pi \propto \exp(-V)$ on $\mathbb{R}^d$, where $V$ is possibly non-convex but $L$-gradient Lipschitz, we prove that averaged Langevin Monte …
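If "averaged" here refers to guarantees stated for the averaged law of the iterates (a common convention in this line of work, though the snippet is cut off before saying so), the relevant object for LMC iterates with laws $\mu_1, \dots, \mu_N$ is

$$\bar\mu_N = \frac{1}{N} \sum_{k=1}^{N} \mu_k,$$

and a draw from $\bar\mu_N$ is obtained by running LMC for $N$ steps and returning the iterate at a uniformly random index.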
It is a fundamental problem to understand the complexity of high-accuracy sampling from a strongly log-concave density $\pi$ on $\mathbb{R}^d$. Indeed, in practice, high-accuracy samplers such as …
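A canonical high-accuracy sampler is the Metropolis-adjusted Langevin algorithm (MALA), which corrects the Langevin proposal with an accept/reject step; it is named here only as a representative example, since the sentence above is cut off before listing any. A minimal sketch, with an illustrative step size and Gaussian example target:

import numpy as np

def mala(V, grad_V, x0, h, n_steps, rng=None):
    """Metropolis-adjusted Langevin algorithm targeting pi ∝ exp(-V)."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    samples = []

    def log_q(to, frm):
        # log density (up to constants) of the proposal N(frm - h*grad_V(frm), 2h I) at `to`
        diff = to - (frm - h * grad_V(frm))
        return -np.dot(diff, diff) / (4.0 * h)

    for _ in range(n_steps):
        prop = x - h * grad_V(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
        log_alpha = (V(x) - V(prop)) + log_q(x, prop) - log_q(prop, x)
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples.append(x.copy())
    return np.array(samples)

# Illustrative strongly log-concave target: V(x) = ||x||^2 / 2.
out = mala(V=lambda x: 0.5 * np.dot(x, x), grad_V=lambda x: x,
           x0=np.zeros(2), h=0.1, n_steps=5000)
print(out.mean(axis=0), out.var(axis=0))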
YA Ma, Y Chen, C Jin… - Proceedings of the …, 2019 - National Acad Sciences
Optimization algorithms and Monte Carlo sampling algorithms have provided the computational foundations for the rapid growth in applications of statistical machine learning …
Y Chen, S Chewi, A Salim… - Conference on Learning …, 2022 - proceedings.mlr.press
We study the proximal sampler of Lee, Shen, and Tian (2021) and obtain new convergence guarantees under weaker assumptions than strong log-concavity: namely, our results hold …
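The proximal sampler alternates a Gaussian "forward" step with a restricted Gaussian oracle (RGO) for the target. The sketch below assumes a Gaussian target so that the RGO has a closed form; that assumption is mine, made only to keep the oracle tractable, and is far more restrictive than the settings analyzed in the paper.

import numpy as np

def proximal_sampler_gaussian(Sigma, eta, x0, n_steps, rng=None):
    """Proximal sampler for the Gaussian target pi = N(0, Sigma).

    Each iteration:
      y ~ N(x, eta * I)                                   (forward Gaussian step)
      x ~ density ∝ exp(-V(x) - ||x - y||^2 / (2*eta))    (restricted Gaussian oracle)
    For V(x) = x^T Sigma^{-1} x / 2 the RGO is the Gaussian
      N( P^{-1} y / eta,  P^{-1} )  with  P = Sigma^{-1} + I / eta.
    """
    rng = rng or np.random.default_rng(0)
    d = len(x0)
    P_inv = np.linalg.inv(np.linalg.inv(Sigma) + np.eye(d) / eta)
    chol = np.linalg.cholesky(P_inv)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        y = x + np.sqrt(eta) * rng.standard_normal(d)
        x = P_inv @ (y / eta) + chol @ rng.standard_normal(d)
        samples.append(x.copy())
    return np.array(samples)

Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
out = proximal_sampler_gaussian(Sigma, eta=0.5, x0=np.zeros(2), n_steps=5000)
print(np.cov(out.T))  # empirical covariance should be close to Sigma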
R Shen, YT Lee - Advances in Neural Information …, 2019 - proceedings.neurips.cc
Sampling from log-concave distributions is a well-researched problem that has many applications in statistics and machine learning. We study the distributions of the form …
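This appears to be the paper introducing the randomized midpoint method, which is developed there for the underdamped (kinetic) Langevin dynamics. The sketch below is a simplified illustration of the randomized-midpoint idea applied instead to overdamped Langevin; that substitution is mine and is not the algorithm analyzed in the paper.

import numpy as np

def randomized_midpoint_lmc(grad_f, x0, h, n_steps, rng=None):
    """Randomized midpoint discretization of overdamped Langevin dX = -grad_f(X) dt + sqrt(2) dB.

    Over one step of length h: draw alpha ~ U[0,1], estimate X(alpha*h) by an Euler step,
    then take the full step using grad_f at that random midpoint, with consistent
    Brownian increments shared between the midpoint and the full step.
    """
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        alpha = rng.uniform()
        w_mid = np.sqrt(alpha * h) * rng.standard_normal(x.shape)            # B(alpha*h)
        w_rest = np.sqrt((1.0 - alpha) * h) * rng.standard_normal(x.shape)   # B(h) - B(alpha*h)
        x_mid = x - alpha * h * grad_f(x) + np.sqrt(2.0) * w_mid
        x = x - h * grad_f(x_mid) + np.sqrt(2.0) * (w_mid + w_rest)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative strongly log-concave example: f(x) = ||x||^2 / 2.
out = randomized_midpoint_lmc(lambda x: x, x0=np.zeros(2), h=0.05, n_steps=5000)
print(out.mean(axis=0), out.var(axis=0))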
S Zhang, S Chewi, M Li… - The Thirty Sixth …, 2023 - proceedings.mlr.press
Underdamped Langevin Monte Carlo (ULMC) is an algorithm used to sample from unnormalized densities by leveraging the momentum of a particle moving in a potential well …
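The underdamped dynamics couple a position with a momentum variable. The sketch below uses a simple Euler-type discretization for readability (the cited analyses typically work with a more careful integrator); the friction parameter gamma, step size, and quadratic potential are illustrative choices of mine.

import numpy as np

def ulmc(grad_V, x0, gamma, h, n_steps, rng=None):
    """Underdamped Langevin Monte Carlo with a crude Euler-type discretization.

    Continuous dynamics:  dX = P dt,   dP = -(gamma * P + grad_V(X)) dt + sqrt(2 * gamma) dB,
    whose X-marginal at stationarity is proportional to exp(-V).
    """
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    p = np.zeros_like(x)  # momentum, initialized at rest
    samples = []
    for _ in range(n_steps):
        x = x + h * p
        p = p - h * (gamma * p + grad_V(x)) + np.sqrt(2.0 * gamma * h) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative potential well: V(x) = ||x||^2 / 2.
out = ulmc(grad_V=lambda x: x, x0=np.zeros(2), gamma=2.0, h=0.01, n_steps=20000)
print(out.mean(axis=0), out.var(axis=0))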