Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Geometric Clifford algebra networks

D Ruhe, JK Gupta, S De Keninck… - International …, 2023 - proceedings.mlr.press
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical
systems. GCANs are based on symmetry group transformations using geometric (Clifford) …

Why Deep Generative Modeling?

JM Tomczak - Deep Generative Modeling, 2021 - Springer
Before we start thinking about (deep) generative modeling, let us consider a simple
example. Imagine we have trained a deep neural network that classifies images (x ∈ ℤ^D) of …

SurVAE flows: Surjections to bridge the gap between VAEs and flows

D Nielsen, P Jaini, E Hoogeboom… - Advances in …, 2020 - proceedings.neurips.cc
Normalizing flows and variational autoencoders are powerful generative models that can
represent complicated density functions. However, they both impose constraints on the …

Skew orthogonal convolutions

S Singla, S Feizi - International Conference on Machine …, 2021 - proceedings.mlr.press
Training convolutional neural networks with a Lipschitz constraint under the $l_2$ norm
is useful for provable adversarial robustness, interpretable gradients, stable training, etc …
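
For context, the $l_2$ Lipschitz constant of a convolution is the largest singular value of the linear operator it implements, and a standard way to estimate it is power iteration. The sketch below illustrates that generic idea in PyTorch; it is not the skew-orthogonal construction this paper proposes, and the shapes and helper name are assumptions.

```python
import torch
import torch.nn.functional as F

def conv_lipschitz_l2(weight, in_shape, iters=50):
    """Estimate the l2 Lipschitz constant (top singular value) of a
    stride-1, zero-padded convolution by power iteration on A^T A."""
    pad = weight.shape[-1] // 2          # 'same' padding; odd kernel assumed
    v = torch.randn(1, *in_shape)        # (1, C_in, H, W)
    for _ in range(iters):
        u = F.conv2d(v, weight, padding=pad)            # u = A v
        v = F.conv_transpose2d(u, weight, padding=pad)  # v = A^T u (adjoint)
        v = v / v.norm()
    return F.conv2d(v, weight, padding=pad).norm().item()  # ||A v|| ~ sigma_max

# Usage (hypothetical layer): a 16->32 channel 3x3 convolution on 8x8 inputs.
w = torch.randn(32, 16, 3, 3) / 9.0
print(conv_lipschitz_l2(w, (16, 8, 8)))
```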

projUNN: efficient method for training deep networks with unitary matrices

B Kiani, R Balestriero, Y LeCun… - Advances in Neural …, 2022 - proceedings.neurips.cc
In learning with recurrent or very deep feed-forward networks, employing unitary matrices in
each layer can be very effective at maintaining long-range stability. However, restricting …
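
A unitary (orthogonal, in the real case) weight matrix preserves the $l_2$ norm of both activations and gradients, which is why such constraints maintain long-range stability. Below is a minimal sketch of the constraint itself using PyTorch's built-in orthogonal parametrization; projUNN's actual contribution is a cheaper low-rank projection scheme, which this sketch does not reproduce.

```python
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import orthogonal

# Keep a 64x64 weight matrix exactly orthogonal throughout training
# via reparametrization (the generic idea, not projUNN's method).
layer = orthogonal(nn.Linear(64, 64, bias=False))

x = torch.randn(5, 64)
y = layer(x)
# Orthogonal maps are isometries: row norms are preserved, so stacking
# many such layers neither explodes nor vanishes signals.
print(torch.allclose(x.norm(dim=1), y.norm(dim=1), atol=1e-5))
```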

Improved techniques for deterministic l2 robustness

S Singla, S Feizi - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Training convolutional neural networks (CNNs) with a strict 1-Lipschitz constraint under the
$l_2$ norm is useful for adversarial robustness, interpretable gradients and stable training …
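
The value of a strict 1-Lipschitz constraint is that it yields a closed-form robustness certificate. A standard margin bound from this literature (not specific to this paper): if $f$ is 1-Lipschitz under the $l_2$ norm and predicts class $t$ at input $x$, then

\[
\|\delta\|_2 < \frac{f_t(x) - \max_{i \neq t} f_i(x)}{\sqrt{2}}
\;\Longrightarrow\;
\arg\max_i\, f_i(x + \delta) = t,
\]

since the difference of any two coordinates of a 1-Lipschitz map is itself $\sqrt{2}$-Lipschitz.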

ButterflyFlow: Building invertible layers with butterfly matrices

C Meng, L Zhou, K Choi, T Dao… - … on Machine Learning, 2022 - proceedings.mlr.press
Normalizing flows model complex probability distributions using maps obtained by
composing invertible layers. Special linear layers such as masked and 1×1 …
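
A butterfly matrix of size $n$ factors into $\log_2 n$ sparse layers, each applying an invertible 2×2 block to paired coordinates, so the matrix-vector product and the log-determinant (the Jacobian term a flow's likelihood needs) both cost $O(n \log n)$. The NumPy sketch below is a generic toy of this structure, not the paper's implementation:

```python
import numpy as np

def butterfly_apply(x, factors):
    """Multiply x by a butterfly matrix given as log2(n) sparse factors.
    factors[level] has shape (n // 2, 2, 2): one invertible 2x2 block per
    coordinate pair at that level.  Cost: O(n log n) vs O(n^2) for dense."""
    n = len(x)
    y = x.astype(float)
    stride, level = n // 2, 0
    while stride >= 1:
        blocks, pair = factors[level], 0
        for start in range(0, n, 2 * stride):
            for i in range(start, start + stride):
                j = i + stride
                (a, b), (c, d) = blocks[pair]
                y[i], y[j] = a * y[i] + b * y[j], c * y[i] + d * y[j]
                pair += 1
        stride, level = stride // 2, level + 1
    return y

def butterfly_logabsdet(factors):
    """log|det| of the product = sum over every 2x2 block, also O(n log n)."""
    return sum(np.log(abs(np.linalg.det(b))) for lvl in factors for b in lvl)

# Usage: n = 8, hence 3 levels of 4 random (almost surely invertible) blocks.
rng = np.random.default_rng(0)
factors = [rng.standard_normal((4, 2, 2)) for _ in range(3)]
print(butterfly_apply(np.arange(8.0), factors), butterfly_logabsdet(factors))
```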

1-Lipschitz Layers Compared: Memory, Speed, and Certifiable Robustness

B Prach, F Brau, G Buttazzo… - Proceedings of the …, 2024 - openaccess.thecvf.com
The robustness of neural networks against input perturbations with bounded magnitude
represents a serious concern in the deployment of deep learning models in safety-critical …

Invertible monotone operators for normalizing flows

B Ahn, C Kim, Y Hong, HJ Kim - Advances in Neural …, 2022 - proceedings.neurips.cc
Normalizing flows model probability distributions by learning invertible transformations that
transfer a simple distribution into complex distributions. Since the architecture of ResNet …
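
The density learning described here rests on the change-of-variables formula: if $f$ is invertible and $z = f(x)$ follows a simple base density $p_Z$, then

\[
\log p_X(x) = \log p_Z\big(f(x)\big) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|,
\]

and for a composition $f = f_K \circ \cdots \circ f_1$ the log-determinant splits into a per-layer sum, which is why invertible ResNet-style residual blocks are attractive building blocks in this line of work.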