A Salim, P Richtarik - Advances in Neural Information …, 2020 - proceedings.neurips.cc
We consider the task of sampling with respect to a log-concave probability distribution. The potential of the target distribution is assumed to be composite, i.e., written as the sum of a …
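The snippet is cut off, but the described setting (a composite potential written as the sum of a smooth term f and a nonsmooth term g) is the one handled by proximal gradient Langevin schemes. The following is a minimal sketch of one such step, assuming that split; the helpers grad_f and prox_g and the lasso-style usage example are illustrative assumptions, not the paper's algorithm.

import numpy as np

def prox_gradient_langevin_step(x, grad_f, prox_g, step, rng):
    # Forward (gradient) Langevin step on the smooth part f, then a
    # backward (proximal) step on the nonsmooth part g.
    noise = np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    y = x - step * grad_f(x) + noise
    return prox_g(y, step)

# Hypothetical usage: Gaussian likelihood plus an L1 term (soft-thresholding prox).
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.5
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
x = np.zeros(5)
for _ in range(1000):
    x = prox_gradient_langevin_step(x, grad_f, prox_g, step=1e-2, rng=rng)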
A Wibisono - arXiv preprint arXiv:1911.01469, 2019 - arxiv.org
We study the Proximal Langevin Algorithm (PLA) for sampling from a probability distribution $\nu = e^{-f}$ on $\mathbb{R}^n$ under isoperimetry. We prove a convergence guarantee …
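As a worked restatement of the standard setting (a sketch under the usual definitions, not the paper's exact statement, since the snippet does not show the PLA iterate): $\nu \propto e^{-f}$ is the stationary distribution of the overdamped Langevin dynamics, and proximal variants use the proximal map of $f$ in place of, or alongside, the explicit gradient step of its discretization,
\[
  \mathrm{d}X_t = -\nabla f(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t,
  \qquad
  \operatorname{prox}_{\eta f}(y) = \arg\min_x \Big\{ f(x) + \tfrac{1}{2\eta}\|x-y\|^2 \Big\}.
\]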
Data augmentation, by the introduction of auxiliary variables, has become a ubiquitous technique to improve convergence properties, simplify the implementation or reduce the …
This paper is concerned with sampling from probability distributions on $\mathbb{R}^d$ admitting a density of the form $\pi \propto \exp(-U)$, where $U = f + g \circ K$, with $K$ being a linear operator and $g$ being nondifferentiable. Two different …
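The proximal map of $g \circ K$ is in general intractable, which is why this literature often introduces an auxiliary variable $z \approx Kx$; the augmentation below is stated as a common device of this kind, not as the snippet's specific construction, since the sentence is truncated before the two approaches are named:
\[
  \pi_\rho(x, z) \;\propto\; \exp\!\Big( -f(x) - g(z) - \tfrac{1}{2\rho^2}\|Kx - z\|^2 \Big),
\]
whose $x$-marginal approaches $\pi \propto \exp(-f - g \circ K)$ as $\rho \to 0$ and whose conditionals in $x$ and $z$ are typically easier to sample from.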
Bayesian methods for solving inverse problems are a powerful alternative to classical methods since the Bayesian approach offers the ability to quantify the uncertainty in the …
Federated sampling algorithms have recently gained great popularity in the machine learning and statistics communities. This paper studies variants of such algorithms called Error …
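The snippet is cut off after "Error …", presumably error-feedback-type methods, in which the part of an update lost to compressed communication is stored locally and re-injected in the next round. The sketch below illustrates only that generic mechanism; the top_k compressor and the client wrapper are illustrative assumptions, not the paper's algorithm.

import numpy as np

def top_k(v, k):
    # Simple contractive compressor: keep the k largest-magnitude entries.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

class ErrorFeedbackClient:
    # Generic error feedback: compress (update + residual), store what was lost.
    def __init__(self, dim, k):
        self.residual = np.zeros(dim)
        self.k = k

    def send(self, update):
        corrected = update + self.residual
        message = top_k(corrected, self.k)   # what actually gets communicated
        self.residual = corrected - message  # compression error kept for next round
        return message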
In order to solve tasks such as uncertainty quantification or hypothesis testing in Bayesian imaging inverse problems, we often have to draw samples from the resulting posterior distribution. For …
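As a worked illustration of the kind of posterior these snippets refer to (the Gaussian likelihood and the regularizer $g$ are assumptions for illustration, not the specific model of the paper): a linear imaging inverse problem $y = Ax + \varepsilon$ with $\varepsilon \sim \mathcal{N}(0, \sigma^2 I)$ and prior $\propto e^{-g(x)}$ gives
\[
  \pi(x \mid y) \;\propto\; \exp\!\Big( -\tfrac{1}{2\sigma^2}\|Ax - y\|^2 - g(x) \Big),
\]
which is exactly the composite, possibly nonsmooth potential targeted by the proximal Langevin samplers above.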
TTK Lau, H Liu - International Conference on Machine …, 2022 - proceedings.mlr.press
We propose efficient Langevin Monte Carlo algorithms for sampling distributions with nonsmooth convex composite potentials, each the sum of a continuously …
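One standard way to run Langevin Monte Carlo on such a composite potential $U = f + g$, with $f$ continuously differentiable and $g$ nonsmooth, is to smooth $g$ by its Moreau envelope and apply the unadjusted Langevin update to the smoothed potential. The sketch below uses the Euclidean Moreau envelope purely as an illustrative stand-in for the paper's algorithms; lam is the smoothing parameter and prox_g the proximal map of g.

import numpy as np

def smoothed_ula_step(x, grad_f, prox_g, lam, step, rng):
    # ULA step on the smoothed potential f + g_lam, where g_lam is the Moreau
    # envelope of g; its gradient is (x - prox_{lam*g}(x)) / lam.
    grad_g_lam = (x - prox_g(x, lam)) / lam
    noise = np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x - step * (grad_f(x) + grad_g_lam) + noise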