Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Normalizing flows for probabilistic modeling and inference

G Papamakarios, E Nalisnick, DJ Rezende… - Journal of Machine …, 2021 - jmlr.org
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …
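The mechanism the snippet refers to is the change-of-variables formula: push a simple base distribution through an invertible map and correct the density by the log-determinant of the Jacobian. A minimal one-dimensional sketch (an assumed toy affine flow, not any paper's implementation):

```python
import numpy as np

# Toy normalizing flow: x = a * z + b applied to a standard-normal base.
# Change of variables: log p_x(x) = log p_z(z) - log|det dx/dz|,
# where z = (x - b) / a and dx/dz = a.

def base_log_prob(z):
    """Log density of the standard normal base distribution."""
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

def flow_log_prob(x, a=2.0, b=1.0):
    """Log density of x under the pushforward of the base through x = a*z + b."""
    z = (x - b) / a
    return base_log_prob(z) - np.log(abs(a))

# Sanity check: the transformed density still integrates to ~1.
xs = np.linspace(-20.0, 20.0, 40001)
dx = xs[1] - xs[0]
total = float(np.sum(np.exp(flow_log_prob(xs))) * dx)
print(round(total, 3))  # ≈ 1.0
```

Real flows stack many such invertible layers with learned parameters; the affine map here just makes the Jacobian correction explicit.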

Flow matching for generative modeling

Y Lipman, RTQ Chen, H Ben-Hamu, M Nickel… - arXiv preprint arXiv …, 2022 - arxiv.org
We introduce a new paradigm for generative modeling built on Continuous Normalizing
Flows (CNFs), allowing us to train CNFs at unprecedented scale. Specifically, we present …

Score-based generative modeling in latent space

A Vahdat, K Kreis, J Kautz - Advances in neural information …, 2021 - proceedings.neurips.cc
Score-based generative models (SGMs) have recently demonstrated impressive results in
terms of both sample quality and distribution coverage. However, they are usually applied …
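The "score" these models learn is the gradient of the log density, which can drive Langevin-style sampling. A hedged toy example with a known 1-D Gaussian target (the papers learn the score with a neural network; here it is written in closed form):

```python
import numpy as np

# For a Gaussian N(mu, sigma^2), the score is analytic:
#   score(x) = d/dx log p(x) = -(x - mu) / sigma^2.
# Unadjusted Langevin dynamics uses it to draw approximate samples.

mu, sigma = 3.0, 1.0

def score(x):
    return -(x - mu) / sigma ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=5000)  # initialize far from the target N(3, 1)
eps = 0.01                 # Langevin step size
for _ in range(2000):
    x = x + 0.5 * eps * score(x) + np.sqrt(eps) * rng.normal(size=x.shape)

print(round(float(x.mean()), 1))  # ≈ mu = 3.0
```

Score-based generative models replace the analytic `score` with a network trained by score matching, typically across a range of noise levels.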

On neural differential equations

P Kidger - arXiv preprint arXiv:2202.02435, 2022 - arxiv.org
The conjoining of dynamical systems and deep learning has become a topic of great
interest. In particular, neural differential equations (NDEs) demonstrate that neural networks …
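The core neural-ODE idea is that a hidden state evolves as dh/dt = f(h, t) for a parameterized vector field f, with depth replaced by integration time. A minimal sketch under assumed toy parameters (fixed-step Euler; real implementations use adaptive solvers and backpropagate through them):

```python
import numpy as np

def f(h, t, W, b):
    """A one-layer 'vector field': tanh(W h + b)."""
    return np.tanh(W @ h + b)

def odeint_euler(h0, t0, t1, steps, W, b):
    """Integrate dh/dt = f(h, t) from t0 to t1 with fixed-step Euler."""
    h, t = h0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h, t, W, b)
        t += dt
    return h

rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(3, 3))  # small random weights (illustrative only)
b = np.zeros(3)
h1 = odeint_euler(np.ones(3), 0.0, 1.0, steps=100, W=W, b=b)
print(h1.shape)  # (3,)
```

Graph neural diffusion and graph ODEs (below) apply the same continuous-time view with f built from graph message passing.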

GRAND: Graph neural diffusion

B Chamberlain, J Rowbottom… - International …, 2021 - proceedings.mlr.press
We present Graph Neural Diffusion (GRAND) that approaches deep learning on
graphs as a continuous diffusion process and treats Graph Neural Networks (GNNs) as …

Normalizing flows: An introduction and review of current methods

I Kobyzev, SJD Prince… - IEEE transactions on …, 2020 - ieeexplore.ieee.org
Normalizing Flows are generative models which produce tractable distributions where both
sampling and density evaluation can be efficient and exact. The goal of this survey article is …

HiPPO: Recurrent memory with optimal polynomial projections

A Gu, T Dao, S Ermon, A Rudra… - Advances in neural …, 2020 - proceedings.neurips.cc
A central problem in learning from sequential data is representing cumulative history in an
incremental fashion as more data is processed. We introduce a general framework (HiPPO) …

Graph neural ordinary differential equations

M Poli, S Massaroli, J Park, A Yamashita… - arXiv preprint arXiv …, 2019 - arxiv.org
We introduce the framework of continuous-depth graph neural networks (GNNs). Graph
neural ordinary differential equations (GDEs) are formalized as the counterpart to GNNs …

Deep equilibrium optical flow estimation

S Bai, Z Geng, Y Savani… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Many recent state-of-the-art (SOTA) optical flow models use finite-step recurrent update
operations to emulate traditional algorithms by encouraging iterative refinements toward a …