This paper builds bridges between two families of probabilistic algorithms: (hierarchical) variational inference (VI), which is typically used to model distributions over continuous …
The recently proposed generative flow networks (GFlowNets) are a method of training a policy to sample compositional discrete objects with probabilities proportional to a given …
L Pan, N Malkin, D Zhang… - … Conference on Machine …, 2023 - proceedings.mlr.press
Generative Flow Networks, or GFlowNets, are related to Markov chain Monte Carlo methods (as they sample from a distribution specified by an energy function), reinforcement …
High training costs of generative models and the need to fine-tune them for specific tasks have created a strong interest in model reuse and composition. A key challenge in …
Generative Flow Networks (or GFlowNets for short) are a family of probabilistic agents that learn to sample complex combinatorial structures through the lens of “inference …
N Malkin, M Jain, E Bengio, C Sun… - Advances in Neural …, 2022 - proceedings.neurips.cc
Generative flow networks (GFlowNets) are a method for learning a stochastic policy for generating compositional objects, such as graphs or strings, from a given unnormalized …
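The snippet above describes training a stochastic policy so that complete objects are sampled with probability proportional to an unnormalized reward. One widely used objective for this is trajectory balance, whose per-trajectory residual is log Z + log P_F(τ) − log R(x) − log P_B(τ); training drives this to zero. A minimal sketch on a toy domain (all names, the reward, and the uniform policy are hypothetical illustrations, not the paper's setup):

```python
import math

# Toy domain: objects are binary strings of length 3, built left to
# right, so each object has exactly one generating trajectory and the
# backward policy P_B is trivially 1 (log P_B = 0) at every step.

def reward(x):
    """Hypothetical unnormalized target: more ones -> higher reward."""
    return 1.0 + sum(x)

def forward_logprob(policy, x):
    """Log-probability of building string x one bit at a time."""
    return sum(math.log(policy[bit]) for bit in x)

# A fixed uniform forward policy: append 0 or 1 with probability 1/2.
policy = {0: 0.5, 1: 0.5}

# The true partition function of this tiny domain (enumerable here;
# in practice log Z is a learned parameter).
log_Z = math.log(sum(reward([b0, b1, b2])
                     for b0 in (0, 1) for b1 in (0, 1) for b2 in (0, 1)))

def tb_residual(x):
    # log Z + log P_F - log R(x); log P_B = 0 in this one-path domain.
    return log_Z + forward_logprob(policy, x) - math.log(reward(x))

# The squared residual is the trajectory balance loss for that sample;
# an untrained uniform policy leaves it nonzero.
for x in ([0, 0, 0], [1, 1, 1]):
    print(x, tb_residual(x) ** 2)
```

With the uniform policy the residual differs across objects, which is exactly the signal gradient descent on the squared residual would use to reshape P_F and log Z.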
There are many frameworks for deep generative modeling, each often presented with their own specific training algorithms and inference methods. Here, we demonstrate the …
D Zhang, N Malkin, Z Liu, A Volokhova… - International …, 2022 - proceedings.mlr.press
We present energy-based generative flow networks (EB-GFN), a novel probabilistic modeling algorithm for high-dimensional discrete data. Building upon the theory of …
Flow-based generative models parameterize probability distributions through an invertible transformation and can be trained by maximum likelihood. Invertible residual networks …
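The last snippet refers to parameterizing a density through an invertible map and training by maximum likelihood via the change-of-variables formula, log p_x(x) = log p_z(f⁻¹(x)) − log |det J_f(z)|. A one-dimensional sketch with a hypothetical affine flow (a, b are illustrative constants, not from any cited model):

```python
import math

# Minimal 1-D affine flow: z ~ N(0, 1), x = f(z) = a*z + b with a > 0,
# so f is invertible and |f'(z)| = a everywhere.
a, b = 2.0, 1.0

def log_standard_normal(z):
    """Log-density of the base distribution N(0, 1)."""
    return -0.5 * (z * z + math.log(2.0 * math.pi))

def log_density_x(x):
    """Change of variables: log p_x(x) = log p_z(f^-1(x)) - log |f'(z)|."""
    z = (x - b) / a
    return log_standard_normal(z) - math.log(a)
```

Because an affine push-forward of N(0, 1) is N(b, a²), the result can be checked against the closed-form Gaussian log-density; maximum-likelihood training simply maximizes `log_density_x` over data while keeping f invertible, which is the constraint invertible residual networks enforce architecturally.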