N Nüsken - arXiv preprint arXiv:2409.01464, 2024 - arxiv.org
We introduce $\textit{Stein transport}$, a novel methodology for Bayesian inference designed to efficiently push an ensemble of particles along a predefined curve of tempered …
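As an aside, a minimal sketch of the kind of tempered curve such particle methods typically follow, assuming the standard geometric interpolation between a prior $\pi_0$ and a likelihood $L$ (the notation is ours, not taken from the abstract):

```latex
% Prior-to-posterior tempering path (illustrative assumption, not the paper's specific choice):
\pi_t(x) \;\propto\; \pi_0(x)\, L(x)^{t}, \qquad t \in [0,1],
% so that \pi_0 is the prior and \pi_1 \propto \pi_0 L is the Bayesian posterior.
```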
In this paper, we study efficient approximate sampling for probability distributions known up to normalization constants. We specifically focus on a problem class arising in Bayesian …
This paper explores the connections between tempering (for Sequential Monte Carlo; SMC) and entropic mirror descent to sample from a target probability distribution whose …
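One way this connection is commonly made, sketched here under the assumption of an entropic mirror-descent step of size $\gamma_k$ on $\mathrm{KL}(\cdot\,\|\,\pi)$ for a target $\pi$ (symbols assumed, not the paper's):

```latex
% Entropic mirror descent on KL(mu || pi) yields a geometric mixture,
% which is exactly a tempering step (illustrative sketch):
\mu_{k+1}(x) \;\propto\; \mu_k(x)^{1-\gamma_k}\, \pi(x)^{\gamma_k}, \qquad \gamma_k \in (0,1].
```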
We give a comprehensive description of Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals $\mathcal{F}_\nu := \mathrm{MMD}_K^2(\cdot,\nu)$ towards …
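For reference, a minimal sketch of how $\mathrm{MMD}_K^2$ is estimated from two particle sets; the Gaussian kernel and the biased V-statistic used here are illustrative assumptions, not the kernels analysed in the cited work:

```python
import numpy as np

def mmd_squared(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of MMD_K^2 between particle sets x and y.

    The Gaussian kernel below is an illustrative assumption; the cited paper
    studies other kernel choices."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

# Toy usage with particles drawn from two shifted Gaussians.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(200, 2))
y = rng.normal(1.0, 1.0, size=(200, 2))
print(mmd_squared(x, y))
```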
Sequential-in-time methods solve a sequence of training problems to fit nonlinear parametrizations such as neural networks to approximate solution trajectories of partial …
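A common formulation underlying such sequential-in-time schemes, sketched with assumed notation (parametric ansatz $u(x;\theta(t))$ for a PDE $\partial_t u = f(x,u)$; this is a generic sketch, not necessarily the paper's exact setup):

```latex
% Time-continuous training problem for the parameters (illustrative sketch):
\dot{\theta}(t) \;\in\; \arg\min_{\eta}\;
  \big\| \nabla_\theta u(\cdot;\theta(t))\,\eta \;-\; f\big(\cdot,\, u(\cdot;\theta(t))\big) \big\|_{L^2}^2 ,
% i.e. the parameter velocity is chosen so the ansatz best tracks the PDE dynamics.
```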
The dynamics of probability density functions has been extensively studied in science and engineering to understand physical phenomena and facilitate algorithmic design. Of …
H Chen, L Ying - arXiv preprint arXiv:2401.15645, 2024 - arxiv.org
Sampling from a multimodal distribution is a fundamental and challenging problem in computational science and statistics. Among various approaches proposed for this task, one …
It has long been posited that there is a connection between the dynamical equations describing evolutionary processes in biology and sequential Bayesian learning methods …
R Duong, N Rux, V Stein, G Steidl - arXiv preprint arXiv:2411.09848, 2024 - arxiv.org
We consider Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals $\mathrm{MMD}_K^2(\cdot,\nu)$ for positive and negative distance kernels $K(x,y) := \pm|x-y|$ …
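As a small illustrative aside, with the negative distance kernel $K(x,y) = -|x-y|$ the squared MMD coincides with the energy distance between the two samples, a standard identity sketched below (the usage example and sample sizes are assumptions):

```python
import numpy as np

def energy_mmd_squared(x, y):
    """MMD_K^2 for the negative distance kernel K(x, y) = -|x - y|.

    With this kernel the squared MMD equals the energy distance between
    the two samples; shown here only as an illustrative identity."""
    d = lambda a, b: np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return 2.0 * d(x, y).mean() - d(x, x).mean() - d(y, y).mean()

# Toy usage with two one-dimensional samples.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=(300, 1))
y = rng.normal(2.0, 1.0, size=(300, 1))
print(energy_mmd_squared(x, y))
```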