Sampling from known probability distributions is a ubiquitous task in computational science, underlying calculations in domains from linguistics to biology and physics. Generative …
E(n) Equivariant Normalizing Flows
VG Satorras, E Hoogeboom, FB Fuchs, I Posner, M Welling - Advances in Neural Information Processing Systems, 2021
This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs). To construct E-NFs, we take the discriminative E(n) …
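Background for the snippet above: an E(n)-equivariant coordinate update is one that commutes with rotations, reflections, and translations of a point cloud. The numpy sketch below is an illustrative assumption, not the E-NF architecture; the update rule and the name `toy_equivariant_update` are made up for the check.

```python
import numpy as np

def toy_equivariant_update(x):
    """Toy E(n)-style update: move each point along difference vectors,
    weighted by a function of the (rotation-invariant) pairwise distances.
    Illustrative only -- not the E-NF architecture itself."""
    diff = x[:, None, :] - x[None, :, :]        # (n, n, 3) difference vectors
    dist2 = (diff ** 2).sum(-1, keepdims=True)  # squared distances: invariant
    weights = np.exp(-dist2)                    # invariant scalar weights
    return x + (weights * diff).sum(axis=1)     # equivariant coordinate update

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))    # random rotation/reflection

# Rotating then updating matches updating then rotating: f(xQ) == f(x)Q.
assert np.allclose(toy_equivariant_update(x @ Q), toy_equivariant_update(x) @ Q)
```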
Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities
J Köhler, L Klein, F Noé - International conference on …, 2020 - proceedings.mlr.press
Normalizing flows are exact-likelihood generative neural networks which approximately transform samples from a simple prior distribution to samples of the probability distribution of …
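For context, the "exact likelihood" referred to in this abstract is the standard change-of-variables identity for an invertible map $f$ pushing a prior $p_Z$ forward to the model density $p_X$:

```latex
\log p_X(x) \;=\; \log p_Z\!\bigl(f^{-1}(x)\bigr)
            \;+\; \log\left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|
```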
Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit
B Tzen, M Raginsky - arXiv preprint arXiv:1905.09883, 2019 - arxiv.org
In deep latent Gaussian models, the latent variable is generated by a time-inhomogeneous Markov chain, where at each time step we pass the current state through a parametric …
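A minimal sketch of the generative pass described here, with a stand-in drift in place of the trained network (the `drift` function and all parameter choices are illustrative assumptions): the latent is rolled forward by time-inhomogeneous Gaussian transitions whose small-step limit is the diffusion regime the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(1)

def drift(z, k):
    """Stand-in for the parametric map applied at step k (a trained
    network in the actual model); time-inhomogeneous via k."""
    return np.tanh(z + 0.1 * k)

K, h, d = 100, 0.01, 2           # steps, step size, latent dimension
z = rng.normal(size=d)           # z_0 ~ N(0, I)
for k in range(K):
    # Gaussian transition: Euler-Maruyama-style update, whose h -> 0
    # limit is a stochastic differential equation.
    z = z + h * drift(z, k) + np.sqrt(h) * rng.normal(size=d)
```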
ODE2VAE: Deep Generative Second Order ODEs with Bayesian Neural Networks
C Yildiz, M Heinonen… - Advances in Neural …, 2019 - proceedings.neurips.cc
We present Ordinary Differential Equation Variational Auto-Encoder (ODE2VAE), a latent second order ODE model for high-dimensional sequential data. Leveraging the …
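The "second order" part can be read via the standard reduction of s'' = a(s, s') to a coupled first-order system in position and velocity. The Euler loop below is a generic illustration under an assumed toy acceleration field, not the paper's Bayesian-neural-network dynamics.

```python
import numpy as np

def accel(s, v):
    """Stand-in acceleration field; ODE2VAE learns this with a
    Bayesian neural network."""
    return -s - 0.1 * v              # damped oscillator, for illustration

# Second-order ODE s'' = a(s, s') as a coupled first-order system.
s, v = np.array([1.0]), np.array([0.0])
dt = 0.01
for _ in range(1000):
    s, v = s + dt * v, v + dt * accel(s, v)
```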
Many scientific problems require processing data in the form of geometric graphs. Unlike generic graph data, geometric graphs exhibit symmetries of translations, rotations, and/or …
Stochastic Normalizing Flows
H Wu, J Köhler, F Noé - Advances in Neural Information …, 2020 - proceedings.neurips.cc
The sampling of probability distributions specified up to a normalization constant is an important problem in both machine learning and statistical mechanics. While classical …
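Background on "specified up to a normalization constant": samples from a tractable model q can be reweighted against an unnormalized target exp(-u(x)) because the unknown constant cancels in self-normalized importance weights. A generic sketch with toy choices of q and u (not the paper's stochastic-flow estimator):

```python
import numpy as np

rng = np.random.default_rng(2)

def u(x):
    """Energy of the unnormalized target: p(x) ∝ exp(-u(x)). Toy choice:
    this makes the target N(1, 1), so E_p[x] = 1."""
    return 0.5 * (x - 1.0) ** 2

# Toy stand-in for a flow: samples from N(0, 1) with known log-density.
x = rng.normal(size=10_000)
log_q = -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

log_w = -u(x) - log_q                  # log importance weights; Z drops out
w = np.exp(log_w - log_w.max())        # stabilized weights
est = np.sum(w * x) / np.sum(w)        # self-normalized estimate of E_p[x]
print(est)                             # ≈ 1.0, the target mean
```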
A normalizing flow is an invertible mapping between an arbitrary probability distribution and a standard normal distribution; it can be used for density estimation and statistical inference …
The Boltzmann distribution is a natural model for many systems, from brains to materials and biomolecules, but is often of limited utility for fitting data because Monte Carlo algorithms are …
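For reference, the distribution in question weights each configuration x by its energy E(x) at temperature T, with a normalizer Z (the partition function) that is generally intractable; Monte Carlo methods aim to sample it without ever computing Z:

```latex
p(x) \;=\; \frac{e^{-E(x)/k_B T}}{Z},
\qquad
Z \;=\; \int e^{-E(x)/k_B T}\, dx
```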