It has been observed that residual networks can be viewed as the explicit Euler discretization of an Ordinary Differential Equation (ODE). This observation motivated the …
Recent work has attempted to interpret residual networks (ResNets) as one step of a forward Euler discretization of an ordinary differential equation, focusing mainly on syntactic …
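The ResNet-to-Euler correspondence described in the two snippets above can be made concrete in a few lines: a residual update x_{n+1} = x_n + f(x_n) is exactly one explicit (forward) Euler step of the ODE dx/dt = f(x) with step size 1. A minimal sketch, where the scalar branch `f` stands in for a trained residual block and is purely an illustrative assumption:

```python
import math

def f(x):
    # Hypothetical residual branch (stands in for a trained block).
    return math.tanh(x)

def residual_block(x):
    # ResNet update: identity shortcut plus the residual branch.
    return x + f(x)

def euler_step(x, h=1.0):
    # One explicit (forward) Euler step of dx/dt = f(x) with step size h.
    return x + h * f(x)

# With step size h = 1 the two updates coincide exactly.
assert residual_block(0.5) == euler_step(0.5, h=1.0)
```

Smaller step sizes (or, equivalently, more layers sharing weights) recover finer discretizations of the same flow, which is the limit that motivates Neural ODEs.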
Y Lu, A Zhong, Q Li, B Dong - International Conference on …, 2018 - proceedings.mlr.press
Deep neural networks have become the state-of-the-art models in numerous machine learning tasks. However, general guidance for network architecture design is still missing. In …
A Pal, Y Ma, V Shah… - … Conference on Machine …, 2021 - proceedings.mlr.press
Democratization of machine learning requires architectures that automatically adapt to new problems. Neural Differential Equations (NDEs) have emerged as a popular modeling …
We show that Neural Ordinary Differential Equations (ODEs) learn representations that preserve the topology of the input space and prove that this implies the existence of …
B Tzen, M Raginsky - arXiv preprint arXiv:1905.09883, 2019 - arxiv.org
In deep latent Gaussian models, the latent variable is generated by a time-inhomogeneous Markov chain, where at each time step we pass the current state through a parametric …
H Xia, V Suliafu, H Ji, T Nguyen… - Advances in …, 2021 - proceedings.neurips.cc
We propose heavy ball neural ordinary differential equations (HBNODEs), leveraging the continuous limit of classical momentum-accelerated gradient descent, to improve neural …
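The continuous limit mentioned in the HBNODE snippet is the heavy ball ODE, x'' + γ x' = f(x), which can be integrated as a first-order system in position and momentum. A minimal sketch under assumed names and a simple forward Euler integrator (not the solver used in the paper):

```python
def heavy_ball_flow(f, x0, gamma=0.5, h=0.01, steps=1000):
    # Integrate the heavy ball ODE  x'' + gamma * x' = f(x)
    # as the first-order system  x' = m,  m' = -gamma * m + f(x),
    # with forward Euler steps of size h (illustrative, not the paper's method).
    x, m = x0, 0.0
    for _ in range(steps):
        x, m = x + h * m, m + h * (-gamma * m + f(x))
    return x

# Example: f(x) = -x, the negative gradient of x**2 / 2.
# The damped trajectory settles toward the minimum at 0.
x_final = heavy_ball_flow(lambda x: -x, x0=2.0)
```

Setting γ = 0 recovers an undamped second-order flow, while large γ approaches plain gradient flow; the momentum variable m is what gives the "heavy ball" its acceleration.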
A Ghosh, H Behl, E Dupont, P Torr… - Advances in Neural …, 2020 - proceedings.neurips.cc
Training Neural Ordinary Differential Equations (ODEs) is often computationally expensive. Indeed, computing the forward pass of such models involves solving an ODE …
We introduce physics-informed neural networks: neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general …
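The physics-informed idea in the snippet above amounts to a composite loss: a data-misfit term plus a penalty on the residual of the governing equation at collocation points. A minimal sketch for the toy ODE u' + u = 0 with an exponential ansatz; the model family, the ODE, and all names here are illustrative assumptions, not the paper's implementation:

```python
import math

def u(params, t):
    # Hypothetical two-parameter ansatz u(t) = a * exp(b * t).
    a, b = params
    return a * math.exp(b * t)

def du_dt(params, t):
    # Analytic derivative of the ansatz (real PINNs use automatic differentiation).
    a, b = params
    return a * b * math.exp(b * t)

def pinn_loss(params, data, collocation):
    # Data term: squared misfit at observed points (t, y).
    data_term = sum((u(params, t) - y) ** 2 for t, y in data)
    # Physics term: squared residual of the governing ODE u' + u = 0.
    physics_term = sum((du_dt(params, t) + u(params, t)) ** 2 for t in collocation)
    return data_term + physics_term

# The exact solution u(t) = exp(-t) drives both terms to zero.
exact = (1.0, -1.0)
data = [(0.0, 1.0), (1.0, math.exp(-1.0))]
colloc = [0.25, 0.5, 0.75]
assert pinn_loss(exact, data, colloc) < 1e-12
```

Training would minimize `pinn_loss` over `params`; the physics term regularizes the fit wherever data is sparse, which is the core mechanism the snippet describes.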