A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations. This is the fundamental problem of Bayesian statistics and …
Annealed Importance Sampling (AIS) and its Sequential Monte Carlo (SMC) extensions are state-of-the-art methods for estimating normalizing constants of probability …
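To make the AIS mechanism concrete, here is a minimal sketch of annealed importance sampling for a normalizing-constant ratio. The toy Gaussian target, the geometric temperature ladder, and the random-walk Metropolis transitions are illustrative assumptions, not taken from the paper above; the true ratio is known analytically so the estimate can be checked.

```python
import numpy as np

rng = np.random.default_rng(1)

def ais_log_z(log_gamma0, log_gamma1, sample0,
              n_particles=2000, n_temps=50, step=2.0):
    """AIS estimate of log(Z1/Z0) between two unnormalized densities,
    annealing along pi_b ∝ gamma0^(1-b) * gamma1^b with one
    random-walk Metropolis step per temperature."""
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = sample0(n_particles)            # exact draws from pi_0
    log_w = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Weight increment: change in annealed log-density at current x.
        log_w += (b - b_prev) * (log_gamma1(x) - log_gamma0(x))
        # One Metropolis move targeting the intermediate density pi_b.
        log_pi = lambda y: (1 - b) * log_gamma0(y) + b * log_gamma1(y)
        prop = x + step * rng.standard_normal(x.shape)
        accept = np.log(rng.uniform(size=x.shape)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    # log(Z1/Z0) = log of the mean importance weight (computed stably).
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))

# Base: standard normal; target: N(0, 3^2) unnormalized.
log_gamma0 = lambda x: -0.5 * x**2
log_gamma1 = lambda x: -0.5 * (x / 3.0)**2
log_z_ratio = ais_log_z(log_gamma0, log_gamma1, rng.standard_normal)
# True value is log(3), since both Gaussians share the sqrt(2*pi) factor.
```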
O Mangoubi, A Smith - The Annals of Applied Probability, 2021 - projecteuclid.org
We obtain several quantitative bounds on the mixing properties of an “ideal” Hamiltonian Monte Carlo (HMC) Markov chain for a strongly log-concave target distribution π on R^d. Our …
Antimicrobial resistance (AMR) emerges when disease-causing microorganisms develop the ability to withstand the effects of antimicrobial therapy. This phenomenon is often fueled …
HD Dau, N Chopin - Journal of the Royal Statistical Society …, 2022 - academic.oup.com
A standard way to move particles in a sequential Monte Carlo (SMC) sampler is to apply several steps of a Markov chain Monte Carlo (MCMC) kernel. Unfortunately, it is not clear …
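The standard move step described above can be sketched as a tempered SMC sampler: reweight, resample, then apply several MCMC steps per particle at each temperature. The Gaussian toy problem, multinomial resampling, and the `n_mcmc` random-walk Metropolis moves are illustrative assumptions, not the paper's own algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def smc_sampler(log_gamma0, log_gamma1, sample0,
                n=2000, n_temps=30, n_mcmc=5, step=1.5):
    """Tempered SMC sampler returning an estimate of log(Z1/Z0).
    At each temperature: reweight, resample, then move every particle
    with n_mcmc random-walk Metropolis steps on the current target."""
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = sample0(n)
    log_z = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Reweight for the new temperature; accumulate log(Z1/Z0).
        log_w = (b - b_prev) * (log_gamma1(x) - log_gamma0(x))
        m = log_w.max()
        log_z += m + np.log(np.mean(np.exp(log_w - m)))
        # Multinomial resampling to equalize the weights.
        w = np.exp(log_w - m)
        idx = rng.choice(n, size=n, p=w / w.sum())
        x = x[idx]
        # Move step: several MCMC iterations targeting pi_b.
        log_pi = lambda y: (1 - b) * log_gamma0(y) + b * log_gamma1(y)
        for _ in range(n_mcmc):
            prop = x + step * rng.standard_normal(n)
            accept = np.log(rng.uniform(size=n)) < log_pi(prop) - log_pi(x)
            x = np.where(accept, prop, x)
    return log_z

log_gamma0 = lambda x: -0.5 * x**2           # standard normal base
log_gamma1 = lambda x: -0.5 * (x / 3.0)**2   # N(0, 9), unnormalized
est = smc_sampler(log_gamma0, log_gamma1, rng.standard_normal)
# True log(Z1/Z0) = log(3) for this pair of Gaussians.
```

How many MCMC steps to take per temperature (`n_mcmc` here) is exactly the tuning question the snippet above raises: too few and the particles do not decorrelate, too many and computation is wasted.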
Sampling from a complex distribution $\pi$ and approximating its intractable normalizing constant $\mathrm{Z}$ are challenging problems. In this paper, a novel family of …
Importance sampling (IS) is a powerful Monte Carlo (MC) methodology for approximating integrals, for instance in the context of Bayesian inference. In IS, the samples are simulated …
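A minimal self-normalized importance sampling sketch of the idea in that snippet: samples are drawn from a tractable proposal and reweighted by the target-to-proposal density ratio. The standard-normal target and the wider Gaussian proposal are illustrative choices (the true expectation E[X²] = 1 is known, so the estimate can be checked); self-normalization lets both densities be specified up to a constant.

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_sampling(log_target, log_proposal, sample_proposal, f, n=100_000):
    """Self-normalized IS estimate of E_pi[f(X)]; log-densities may be
    unnormalized, since the constants cancel in the ratio."""
    x = sample_proposal(n)
    log_w = log_target(x) - log_proposal(x)   # unnormalized log-weights
    w = np.exp(log_w - log_w.max())           # stabilize before exponentiating
    return np.sum(w * f(x)) / np.sum(w)

# Target: standard normal, so E[X^2] = 1.
log_target = lambda x: -0.5 * x**2
# Proposal: N(0, 2^2), wider than the target so the weights stay bounded.
log_proposal = lambda x: -0.5 * (x / 2.0)**2
sample_proposal = lambda n: 2.0 * rng.standard_normal(n)

est = importance_sampling(log_target, log_proposal, sample_proposal,
                          lambda x: x**2)
```

Choosing a proposal with heavier tails than the target is what keeps the weight variance finite here; a narrower proposal would make the estimator unstable.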
The sampling problem is one of the most widely studied topics in computational chemistry. While various methods exist for sampling along a set of reaction coordinates, many require …
We show how to speed up sequential Monte Carlo (SMC) for Bayesian inference in large data problems by data subsampling. SMC sequentially updates a cloud of particles through …