Mirrored Langevin dynamics

YP Hsieh, A Kavis, P Rolland… - Advances in Neural …, 2018 - proceedings.neurips.cc
We consider the problem of sampling from constrained distributions, which has posed
significant challenges to both non-asymptotic analysis and algorithmic design. We propose …
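
The general recipe behind mirrored Langevin schemes is easy to sketch: pick a mirror map adapted to the constraint set, run Langevin dynamics on the pushforward (dual) potential in unconstrained mirror coordinates, and map the iterates back through the inverse mirror map. The snippet below is a minimal illustration of that recipe for the probability simplex with the entropic mirror map, not the authors' exact algorithm; in particular, the gradient of the dual potential (which includes a Jacobian correction term) is left as a user-supplied callable, and the step size is arbitrary.

```python
import numpy as np

def softmax(y):
    """Inverse of the entropic mirror map, mapping dual coordinates back to the simplex."""
    z = y - y.max()
    e = np.exp(z)
    return e / e.sum()

def mirrored_langevin(grad_dual_potential, y0, step=1e-3, n_steps=10_000, rng=None):
    """Langevin dynamics run in mirror (dual) coordinates.

    grad_dual_potential: gradient of the potential of the pushforward target
    in dual space (its exact form, including the Jacobian correction, depends
    on the mirror map and is spelled out in the paper).
    """
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y0, dtype=float).copy()
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(y.shape)
        y = y - step * grad_dual_potential(y) + np.sqrt(2.0 * step) * noise
        samples.append(softmax(y))          # constrained iterate on the simplex
    return np.array(samples)
```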

An entropic generalization of Caffarelli's contraction theorem via covariance inequalities

S Chewi, AA Pooladian - Comptes …, 2023 - comptes-rendus.academie-sciences …
The optimal transport map between the standard Gaussian measure and an α-strongly log-concave probability measure is α^{-1/2}-Lipschitz, as first observed in a celebrated theorem of …
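
For context, the theorem paraphrased in the snippet is the following standard statement (written in our notation, not copied from the paper):

```latex
% Caffarelli's contraction theorem (standard formulation).
% Let \mu = e^{-V}\,\mathrm{d}x with \nabla^2 V \succeq \alpha I_d, \alpha > 0,
% and let T = \nabla\varphi be the optimal (Brenier) map pushing the standard
% Gaussian \gamma_d forward to \mu.  Then T is \alpha^{-1/2}-Lipschitz:
\[
  \|T(x) - T(y)\| \;\le\; \alpha^{-1/2}\,\|x - y\|
  \qquad \text{for all } x, y \in \mathbb{R}^d .
\]
```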

Private convex optimization in general norms

S Gopi, YT Lee, D Liu, R Shen, K Tian - Proceedings of the 2023 Annual ACM …, 2023 - SIAM
We propose a new framework for differentially private optimization of convex functions which
are Lipschitz in an arbitrary norm ||·||_X. Our algorithms are based on a regularized …
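
As a reminder (standard definitions, not specific to this paper), Lipschitzness in a general norm ||·||_X is equivalent, for differentiable f, to a gradient bound in the dual norm:

```latex
% L-Lipschitzness of a differentiable f with respect to an arbitrary norm \|\cdot\|_X,
% and its equivalent formulation via the dual norm \|\cdot\|_{X^*}.
\[
  |f(x) - f(y)| \le L\,\|x - y\|_X \ \ \forall x, y
  \qquad\Longleftrightarrow\qquad
  \sup_{x} \|\nabla f(x)\|_{X^*} \le L .
\]
```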

The Brownian transport map

D Mikulincer, Y Shenfeld - Probability Theory and Related Fields, 2024 - Springer
Contraction properties of transport maps between probability measures play an important
role in the theory of functional inequalities. The actual construction of such maps, however …
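
The connection to functional inequalities alluded to in the snippet is the standard transfer principle (a general fact, not this paper's specific result): pushing a measure forward by an L-Lipschitz map degrades its Poincaré and log-Sobolev constants by at most a factor L^2.

```latex
% Transfer of functional inequalities along an L-Lipschitz map T with T_{\#}\gamma = \mu.
\[
  C_{\mathrm{P}}(\mu) \;\le\; L^2\, C_{\mathrm{P}}(\gamma),
  \qquad
  C_{\mathrm{LSI}}(\mu) \;\le\; L^2\, C_{\mathrm{LSI}}(\gamma).
\]
```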

On the Lipschitz properties of transportation along heat flows

D Mikulincer, Y Shenfeld - Geometric Aspects of Functional Analysis: Israel …, 2023 - Springer
We prove new Lipschitz properties for transport maps along heat flows, constructed by Kim
and Milman. For (semi)-log-concave measures and Gaussian mixtures, our bounds have …
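
For orientation, "semi-log-concave" is commonly used in this literature for measures whose potential satisfies a uniform, possibly negative, lower bound on its Hessian (our gloss of the standard terminology, not a quote from the paper):

```latex
% \kappa-semi-log-concavity: a uniform lower curvature bound on the potential.
\[
  \mu = e^{-V}\,\mathrm{d}x, \qquad \nabla^2 V \succeq \kappa\, I_d \ \ \text{for some } \kappa \in \mathbb{R},
\]
% with \kappa > 0 recovering \kappa-strong log-concavity.
```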

Transportation onto log-Lipschitz perturbations

M Fathi, D Mikulincer, Y Shenfeld - Calculus of Variations and Partial …, 2024 - Springer
We establish sufficient conditions for the existence of globally Lipschitz transport maps
between probability measures and their log-Lipschitz perturbations, with dimension-free …
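
In the sense usually meant in this line of work (our gloss, not quoted from the paper), a log-Lipschitz perturbation of μ is a reweighting of μ by a density whose logarithm is Lipschitz:

```latex
% Log-Lipschitz perturbation \nu of a probability measure \mu.
\[
  \mathrm{d}\nu \;=\; \frac{e^{-f}}{Z}\,\mathrm{d}\mu,
  \qquad |f(x) - f(y)| \le L\,\|x - y\| \ \ \forall x, y,
  \qquad Z = \int e^{-f}\,\mathrm{d}\mu .
\]
```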

Universal approximation using well-conditioned normalizing flows

H Lee, C Pabbaraju, AP Sevekari… - Advances in Neural …, 2021 - proceedings.neurips.cc
Normalizing flows are a widely used class of latent-variable generative models with a
tractable likelihood. Affine-coupling models [Dinh et al., 2014, 2016] are a particularly …
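
For readers unfamiliar with the affine-coupling construction of Dinh et al. referenced in the snippet, here is a minimal sketch of a single coupling layer; the even split of coordinates and the generic callables s_net and t_net are illustrative choices, not details taken from this paper.

```python
import numpy as np

def affine_coupling_forward(x, s_net, t_net):
    """One RealNVP-style affine coupling layer (illustrative sketch).

    The first half of the input passes through unchanged and parameterizes an
    elementwise affine map of the second half.  The Jacobian is triangular, so
    the log-determinant is simply the sum of the scale outputs.
    """
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s, t = s_net(x1), t_net(x1)              # arbitrary functions (e.g. small nets) of x1
    y2 = x2 * np.exp(s) + t
    log_det = s.sum(axis=-1)                 # log |det J| of the layer
    return np.concatenate([x1, y2], axis=-1), log_det

def affine_coupling_inverse(y, s_net, t_net):
    """Exact inverse of the layer above -- the tractability that makes
    affine-coupling flows attractive in the first place."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    s, t = s_net(y1), t_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)
```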

A mean-field theory of lazy training in two-layer neural nets: entropic regularization and controlled McKean-Vlasov dynamics

B Tzen, M Raginsky - arXiv preprint arXiv:2002.01987, 2020 - arxiv.org
We consider the problem of universal approximation of functions by two-layer neural nets
with random weights that are "nearly Gaussian" in the sense of Kullback-Leibler divergence …
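
Schematically (our notation, not the paper's), the "nearly Gaussian" requirement corresponds to an entropically regularized objective over the distribution ρ of the network's random weights, with a KL penalty toward a fixed Gaussian reference ρ_0:

```latex
% Schematic entropically regularized mean-field objective for a two-layer net
% f_\rho(x) = \int \sigma(\langle w, x\rangle)\,\rho(\mathrm{d}w); \beta^{-1} is the temperature.
\[
  \min_{\rho}\;\; \mathbb{E}_{(x,y)}\big[\ell\big(f_\rho(x),\, y\big)\big]
  \;+\; \beta^{-1}\,\mathrm{KL}\!\left(\rho \,\middle\|\, \rho_0\right).
\]
```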

Bounds on optimal transport maps onto log-concave measures

M Colombo, M Fathi - Journal of Differential Equations, 2021 - Elsevier

An optimization perspective on log-concave sampling and beyond

S Chewi - 2023 - dspace.mit.edu
The primary contribution of this thesis is to advance the theory of complexity for sampling
from a continuous probability density over R^d. Some highlights include: a new analysis of …
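
As background for the thesis topic, the basic discrete-time sampler studied throughout the log-concave sampling literature is the unadjusted Langevin algorithm; the sketch below is a generic baseline implementation (step size and example target are arbitrary choices, not results or algorithms from the thesis).

```python
import numpy as np

def ula(grad_potential, x0, step=1e-2, n_steps=10_000, rng=None):
    """Unadjusted Langevin algorithm (ULA) for sampling from exp(-V) on R^d.

    Euler--Maruyama discretization of the Langevin diffusion
    dX_t = -grad V(X_t) dt + sqrt(2) dB_t; for smooth, strongly log-concave V
    the iterates approach the target up to a step-size-dependent bias.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - step * grad_potential(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

# Example: V(x) = ||x||^2 / 2 (standard Gaussian target), so grad V(x) = x.
draws = ula(lambda x: x, x0=np.zeros(2), step=0.05, n_steps=2_000)
```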