X Li, Y Wu, L Mackey… - Advances in neural …, 2019 - proceedings.neurips.cc
Abstract: Sampling with Markov chain Monte Carlo methods typically amounts to discretizing some continuous-time dynamics with numerical integration. In this paper, we establish the …
We consider the problem of sampling from a target distribution, which is not necessarily log-concave, in the context of empirical risk minimization and stochastic optimization as …
We provide a nonasymptotic analysis of the convergence of stochastic gradient Hamiltonian Monte Carlo (SGHMC) to a target measure in Wasserstein-2 distance without …
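The SGHMC-style update referred to above can be sketched as follows. This is a hedged illustration of the standard friction-plus-noise discretization only, run with the exact gradient of a 1-D standard Gaussian target; the step size `h`, friction `gamma`, and target are illustrative choices and nothing here reproduces the stochastic-gradient setting or the Wasserstein analysis of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_U(x):
    """Gradient of the potential U(x) = x^2 / 2, i.e. target N(0, 1)."""
    return x

h, gamma, n_steps, burn_in = 0.05, 1.0, 20000, 5000
x, v = 0.0, 0.0
trace = []
for _ in range(n_steps):
    # Momentum step: gradient force, friction, and injected noise scaled
    # to balance the friction (fluctuation-dissipation relation).
    v += -h * grad_U(x) - gamma * h * v \
         + np.sqrt(2.0 * gamma * h) * rng.standard_normal()
    # Position step uses the freshly updated momentum.
    x += h * v
    trace.append(x)

samples = np.asarray(trace[burn_in:])  # empirical mean ~ 0, variance ~ 1
```

For small `h` the empirical moments of `samples` approach those of the N(0, 1) target; the discretization bias shrinks with the step size.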
In this paper, we are concerned with a non-asymptotic analysis of sampling algorithms used in nonconvex optimization. In particular, we obtain non-asymptotic estimates in Wasserstein …
The randomized midpoint method, proposed by Shen and Lee (2019), has emerged as an optimal discretization procedure for simulating the continuous-time underdamped Langevin …
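The randomized midpoint idea is easiest to see on the overdamped dynamics dX = −∇f(X) dt + √2 dW: evaluate the gradient at a uniformly random time inside each step, with the Brownian increment split consistently at that point. The sketch below applies this to a 1-D Gaussian target; the paper above concerns the underdamped dynamics, which the same idea handles with an extra momentum variable, and all constants here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def grad_f(x):
    """Gradient of f(x) = x^2 / 2 (target N(0, 1))."""
    return x

h, n_steps, burn_in = 0.1, 30000, 3000
x = 0.0
trace = []
for _ in range(n_steps):
    # Pick a uniformly random evaluation time alpha * h inside the step.
    alpha = rng.uniform()
    # Brownian increment up to the midpoint time alpha * h ...
    w_mid = np.sqrt(alpha * h) * rng.standard_normal()
    # ... and its independent continuation over the remaining (1 - alpha) * h.
    w_rest = np.sqrt((1.0 - alpha) * h) * rng.standard_normal()
    # Coarse predictor at the random midpoint.
    x_mid = x - alpha * h * grad_f(x) + np.sqrt(2.0) * w_mid
    # Full step: gradient at the random midpoint, full Brownian increment
    # w_mid + w_rest (correlated with the predictor's noise).
    x = x - h * grad_f(x_mid) + np.sqrt(2.0) * (w_mid + w_rest)
    trace.append(x)

samples = np.asarray(trace[burn_in:])
```

Splitting the Brownian increment (rather than drawing fresh noise for the corrector) is what makes the randomization an unbiased estimate of the time-averaged drift over the step.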
Artificial neural networks (ANNs) are typically highly nonlinear systems that are finely tuned via the optimization of their associated nonconvex loss functions. In many cases, the …
H Manor, T Michaeli - arXiv preprint arXiv:2309.13598, 2023 - arxiv.org
Denoisers play a central role in many applications, from noise suppression in low-grade imaging sensors, to empowering score-based generative models. The latter category of …
V Elvira, E Chouzenoux - IEEE Transactions on Signal …, 2022 - ieeexplore.ieee.org
Adaptive importance sampling (AIS) methods are increasingly used for the approximation of distributions and related intractable integrals in the context of Bayesian inference …
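A minimal adaptive importance sampling loop, in the population Monte Carlo flavor, can be sketched as below: a Gaussian proposal whose mean is re-fitted to the weighted samples at each iteration. The target, the proposal family, and all constants are illustrative assumptions, not the specific AIS schemes surveyed in the paper above.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    """Unnormalized log-density of an N(3, 1) target (illustrative)."""
    return -0.5 * (x - 3.0) ** 2

mu, sigma, n_per_iter = 0.0, 2.0, 500
for _ in range(30):
    # Draw from the current proposal q = N(mu, sigma^2).
    xs = mu + sigma * rng.standard_normal(n_per_iter)
    log_q = -0.5 * ((xs - mu) / sigma) ** 2 - np.log(sigma)
    # Self-normalized importance weights, stabilized by subtracting the max.
    log_w = log_target(xs) - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Adaptation: move the proposal mean to the weighted sample mean.
    mu = float(np.sum(w * xs))

# Self-normalized IS estimate of E[X] under the target (true value: 3).
posterior_mean_estimate = float(np.sum(w * xs))
```

The same skeleton accommodates the richer adaptation rules (covariance updates, mixtures, resampling) that distinguish the AIS variants in the literature.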
T Johnston, I Lytras, S Sabanis - Journal of Complexity, 2024 - Elsevier
In this article we consider sampling from log-concave distributions in the Hamiltonian setting, without assuming that the objective gradient is globally Lipschitz. We propose two …
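One generic device for coping with a gradient that is not globally Lipschitz is "taming": dividing the drift by 1 + h·|∇U| so that no single step can blow up. The sketch below applies this inside a kinetic (underdamped) Langevin update on the quartic target exp(−x⁴/4); it illustrates the general taming device under assumed constants, not necessarily either of the two algorithms the paper above proposes.

```python
import numpy as np

rng = np.random.default_rng(3)

def grad_U(x):
    """Gradient of U(x) = x^4 / 4, which is not globally Lipschitz."""
    return x ** 3

h, gamma, n_steps, burn_in = 0.01, 1.0, 50000, 10000
x, v = 0.0, 0.0
trace = []
for _ in range(n_steps):
    g = grad_U(x)
    # Taming: the effective per-step drift h * g_tamed is bounded by 1.
    g_tamed = g / (1.0 + h * abs(g))
    v += -h * g_tamed - gamma * h * v \
         + np.sqrt(2.0 * gamma * h) * rng.standard_normal()
    x += h * v
    trace.append(x)

samples = np.asarray(trace[burn_in:])
# For the density prop. to exp(-x^4/4): E[x] = 0 and
# E[x^2] = 2 * Gamma(3/4) / Gamma(1/4) ~ 0.68.
```

As h → 0 the tamed drift converges back to the true gradient, so the scheme targets the correct dynamics while remaining stable at any fixed step size.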