L Chizat - Mathematical Programming, 2022 - Springer
Minimizing a convex function of a measure with a sparsity-inducing penalty is a typical problem arising, e.g., in sparse spikes deconvolution or two-layer neural network training …
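The setting described here is easiest to picture through its particle parameterization: represent the unknown measure as a sum of weighted Diracs and run gradient descent on weights and positions. Below is a minimal sketch assuming a small deconvolution-style problem with Gaussian features, nonnegative weights, and a total-mass (sparsity) penalty; the objective, data, and step sizes are illustrative choices, not the paper's algorithm.

```python
import numpy as np

# Sketch of the over-parameterized particle viewpoint: write mu = sum_i w_i * delta_{x_i}
# and descend 0.5 * mean((A mu - b)^2) + lam * sum(w) in (w, x).
# Features, data, and step sizes below are assumptions for illustration.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 64)                       # measurement grid
s = 0.05                                            # feature width
phi = lambda x: np.exp(-0.5 * ((t[:, None] - x[None, :]) / s) ** 2)

x_true = np.array([0.3, 0.7]); w_true = np.array([1.0, 0.5])
b = phi(x_true) @ w_true                            # noiseless observations

n, lam, lr = 50, 0.01, 0.1                          # over-parameterize with n particles
x = rng.uniform(0.0, 1.0, n)                        # particle positions
w = np.full(n, 1.0 / n)                             # nonnegative particle weights

for _ in range(3000):
    P = phi(x)
    r = P @ w - b                                   # residual A(mu) - b
    grad_w = P.T @ r / t.size + lam                 # data-fit gradient + sparsity penalty
    dP_dx = P * ((t[:, None] - x[None, :]) / s ** 2)
    grad_x = w * (dP_dx * r[:, None]).sum(axis=0) / t.size
    w = np.maximum(w - lr * grad_w, 0.0)            # projected step keeps w >= 0
    x = x - lr * grad_x

print("particles with non-negligible weight:", int(np.sum(w > 1e-3)))
```

The penalty drives most of the redundant weights to zero, so only a few particles survive, which is the sparsity effect the entry refers to.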
G Khan, J Zhang - Information Geometry, 2022 - Springer
Information geometry and optimal transport are two distinct geometric frameworks for modeling families of probability measures. In recent years, there has been a …
Particle-based variational inference methods (ParVIs) have gained attention in the Bayesian inference literature, for their capacity to yield flexible and accurate approximations. We …
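One representative ParVI is Stein variational gradient descent (SVGD); the sketch below assumes a standard-normal target, an RBF kernel with fixed bandwidth, and a constant step size, so it illustrates the generic particle-update pattern rather than the specific method of this entry.

```python
import numpy as np

# SVGD sketch, one representative ParVI: particles move along kernel-weighted
# scores (pulling them toward high target density) plus a repulsive kernel-gradient
# term (keeping them spread out).  Target, kernel, and step size are assumptions.

rng = np.random.default_rng(0)
score = lambda x: -x                               # grad log p for a N(0, 1) target
x = rng.normal(loc=5.0, scale=1.0, size=(100, 1))  # particles, badly initialized
n, h = x.shape[0], 0.5                             # particle count, kernel bandwidth

for _ in range(1000):
    diff = x[:, None, :] - x[None, :, :]                   # (n, n, d) pairwise x_i - x_j
    k = np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * h))    # RBF kernel matrix (n, n)
    grad_kj = diff * k[..., None] / h                      # grad_{x_j} k(x_j, x_i), entry (i, j)
    phi = (k @ score(x) + grad_kj.sum(axis=1)) / n         # SVGD update direction
    x = x + 0.05 * phi

print("particle mean and std (target is N(0, 1)):", float(x.mean()), float(x.std()))
```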
E Bernton, PE Jacob, M Gerber… - … and Inference: A …, 2019 - academic.oup.com
Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data …
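In one dimension the Wasserstein distance between two equal-size empirical samples reduces to comparing order statistics, which makes a minimum-Wasserstein estimator easy to sketch. The Gaussian location model, the grid search, and the use of common random numbers below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

# Minimum Wasserstein estimation sketch (1-D): choose the parameter theta whose
# model sample is closest, in W1, to the observed data.

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=500)      # observed sample

def w1_empirical(a, b):
    # For equal-size 1-D samples, W1 is the mean absolute difference of order statistics.
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

def model_sample(theta, n, rng):
    return rng.normal(loc=theta, scale=1.0, size=n)  # assumed location model

grid = np.linspace(0.0, 4.0, 81)
# Fixed seed per theta (common random numbers) keeps the loss smooth in theta.
losses = [w1_empirical(data, model_sample(th, data.size, np.random.default_rng(1)))
          for th in grid]
theta_hat = grid[int(np.argmin(losses))]
print("minimum-W1 estimate of the location (true value 2.0):", theta_hat)
```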
G Fu, S Osher, W Li - Journal of Computational Physics, 2023 - Elsevier
We design and compute first-order implicit-in-time variational schemes with high-order spatial discretization for initial value gradient flows in generalized optimal transport metric …
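For orientation, the implicit-in-time step such schemes build on is, in the classical Wasserstein-2 case, the JKO minimizing-movement step; writing it out, with W standing in for the generalized transport metric of the paper, makes "implicit in time" concrete.

```latex
% One implicit (minimizing-movement / JKO-type) step of size \tau for an energy E,
% with W a (generalized) optimal transport metric; shown for the scheme's structure only.
\rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \;
  \frac{1}{2\tau}\, W\!\left(\rho, \rho^{k}\right)^{2} \;+\; E(\rho)
```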
L Nurbekyan, W Lei, Y Yang - SIAM Journal on Scientific Computing, 2023 - SIAM
We propose efficient numerical schemes for implementing the natural gradient descent (NGD) for a broad range of metric spaces with applications to PDE-based optimization …
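As a reminder of what NGD computes, here is a minimal sketch in which the metric tensor is supplied explicitly and the parameters are Euclidean; the quadratic objective and diagonal metric are assumptions, not the schemes proposed in this entry.

```python
import numpy as np

# Natural gradient descent sketch: precondition the Euclidean gradient by the
# inverse of a metric tensor G(theta).  Objective and metric below are illustrative.

def loss(theta):
    return 0.5 * np.sum(np.array([1.0, 100.0]) * theta ** 2)   # badly scaled quadratic

def grad(theta):
    return np.array([1.0, 100.0]) * theta

def metric(theta):
    # Assumed metric tensor; in applications this would be, e.g., a Fisher or transport metric.
    return np.diag([1.0, 100.0])

theta = np.array([1.0, 1.0])
for _ in range(50):
    g = np.linalg.solve(metric(theta), grad(theta))   # natural gradient direction
    theta = theta - 0.1 * g

print("theta after NGD:", theta, "loss:", loss(theta))
```

Because the assumed metric matches the objective's curvature here, the preconditioned step removes the poor scaling that plain gradient descent would suffer; the papers' point is to make this preconditioning efficient for much richer metrics.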
Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals with non-smooth Riesz kernels show a rich structure as singular measures can become absolutely …
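A minimal particle sketch of such a flow, assuming a fixed one-dimensional target sample, the negative-distance (Riesz, r = 1) kernel, explicit Euler steps, and a small smoothing constant at the kernel's kink; it shows the flow's mechanics, not the paper's analysis of singular-to-absolutely-continuous transitions.

```python
import numpy as np

# Particle discretization of an MMD gradient flow with the Riesz (negative distance)
# kernel k(x, y) = -|x - y|: particles x descend the gradient of MMD^2 toward a fixed
# target sample y.  Target, step size, and smoothing eps are assumptions.

rng = np.random.default_rng(0)
y = rng.normal(loc=0.0, scale=1.0, size=200)   # target particles (fixed)
x = rng.normal(loc=6.0, scale=0.1, size=200)   # flowing particles
n, m, eps = x.size, y.size, 1e-6               # eps smooths |.| at the kernel's kink

def grad_k(a, b):
    # d/da of k(a, b) = -|a - b|, smoothed: -(a - b) / sqrt((a - b)^2 + eps)
    d = a[:, None] - b[None, :]
    return -d / np.sqrt(d ** 2 + eps)

for _ in range(3000):
    # Gradient of MMD^2 w.r.t. each x_i (the target-target term is constant in x).
    g = 2.0 * grad_k(x, x).sum(axis=1) / n ** 2 - 2.0 * grad_k(x, y).sum(axis=1) / (n * m)
    # Plain descent on positions realizes the Wasserstein flow up to a time rescaling by n.
    x = x - 5.0 * g

print("flowed mean/std:", round(float(x.mean()), 2), round(float(x.std()), 2),
      "| target mean/std:", round(float(y.mean()), 2), round(float(y.std()), 2))
```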
While likelihood-based inference and its variants provide a statistically efficient and widely applicable approach to parametric inference, their application to models involving …
Y Mroueh, T Nguyen - International Conference on Artificial …, 2021 - proceedings.mlr.press
We consider the maximum mean discrepancy (MMD) GAN problem and propose a parametric kernelized gradient flow that mimics the min-max game in gradient-regularized MMD GAN …
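For reference, the object whose gradient transports samples in such flows is the MMD witness function; the display below is its standard definition with a parametric kernel k_theta (the adversarially trained ingredient in the min-max setting), not this paper's specific flow.

```latex
% MMD witness between the model measure \mu and data measure \nu for a kernel k_\theta;
% samples move along -\nabla f_{\mu,\nu}, while \theta is adapted adversarially.
f_{\mu,\nu}(z) \;=\; \mathbb{E}_{x\sim\mu}\!\left[k_{\theta}(x,z)\right]
               \;-\; \mathbb{E}_{y\sim\nu}\!\left[k_{\theta}(y,z)\right]
```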