Generative learning for nonlinear dynamics

W Gilpin - Nature Reviews Physics, 2024 - nature.com
Modern generative machine learning models are able to create realistic outputs far beyond
their training data, such as photorealistic artwork, accurate protein structures or …

Weight fluctuations in deep linear neural networks and a derivation of the inverse-variance flatness relation

M Gross, AP Raulf, C Räth - Physical Review Research, 2024 - APS
We investigate the stationary (late-time) training regime of single- and two-layer
underparameterized linear neural networks within the continuum limit of stochastic gradient …
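The stationary regime the snippet refers to can be illustrated with a toy experiment: train a single-layer linear model by per-sample SGD and look at the late-time weight trajectory, whose mean sits at the optimum while the weight keeps fluctuating. This is a minimal hypothetical sketch, not the paper's model or its continuum-limit analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy single-layer linear "network" y = w * x, trained by per-sample SGD
# on synthetic data (hypothetical setup for illustration only).
x = rng.normal(size=20000)
y = 2.0 * x + 0.5 * rng.normal(size=20000)

w, lr = 0.0, 0.02
history = []
for xi, yi in zip(x, y):
    w -= lr * 2.0 * (w * xi - yi) * xi  # gradient of (w*xi - yi)^2
    history.append(w)

# Discard the transient; the remaining trajectory is the stationary regime:
# the mean is near the true weight 2.0, but the gradient noise keeps the
# weight fluctuating around it instead of converging to a point.
late = np.array(history[10000:])
print(late.mean(), late.std())
```

The nonzero standard deviation in the late-time window is the kind of stationary weight fluctuation such analyses characterize.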

Stochastic Thermodynamics of Learning Parametric Probabilistic Models

SS Parsi - Entropy, 2024 - mdpi.com
We have formulated a family of machine learning problems as the time evolution of
parametric probabilistic models (PPMs), inherently rendering a thermodynamic process. Our …

On Networks and their Applications: Stability of Gene Regulatory Networks and Gene Function Prediction using Autoencoders

H Coban - arXiv preprint arXiv:2408.07064, 2024 - arxiv.org
We prove that nested canalizing functions are the minimum-sensitivity Boolean functions for
any activity ratio and we determine the functional form of this boundary which has a …
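The sensitivity notion in this claim can be checked by brute force for small Boolean functions: average, over all inputs, the number of single-bit flips that change the output. The sketch below compares one example of a nested canalizing function against parity (a maximally sensitive function); the specific functions are illustrative choices, not taken from the paper.

```python
from itertools import product

def avg_sensitivity(f, n):
    """Average over all 2**n inputs of the number of bit flips that change f."""
    total = 0
    for x in product((0, 1), repeat=n):
        for i in range(n):
            y = list(x)
            y[i] ^= 1  # flip the i-th input bit
            total += f(x) != f(tuple(y))
    return total / 2 ** n

# A nested canalizing function (each variable canalizes in turn) vs. parity.
ncf = lambda x: x[0] and (x[1] or x[2])
parity = lambda x: sum(x) % 2

print(avg_sensitivity(ncf, 3), avg_sensitivity(parity, 3))  # NCF is lower
```

Here the nested canalizing function's average sensitivity (1.25) is well below parity's (3.0), consistent with the minimum-sensitivity result the abstract states.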

Minibatch training of neural network ensembles via trajectory sampling

JF Mair, L Causer, JP Garrahan - arXiv preprint arXiv:2306.13442, 2023 - arxiv.org
Most iterative neural network training methods use estimates of the loss function over small
random subsets (or minibatches) of the data to update the parameters, which aid in …
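The minibatch mechanism this snippet describes — estimating the loss gradient on a small random subset of the data and updating the parameters from that noisy estimate — can be sketched in a few lines. This is plain minibatch SGD on a synthetic linear-regression problem, not the trajectory-sampling method the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data (hypothetical setup for illustration).
X = rng.normal(size=(256, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=256)

w = np.zeros(4)  # parameters to learn
lr, batch_size = 0.1, 32

for step in range(500):
    # Estimate the loss gradient on a small random subset (a minibatch)
    # instead of the full dataset.
    idx = rng.choice(len(X), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    grad = (2.0 / batch_size) * Xb.T @ (Xb @ w - yb)
    w -= lr * grad  # parameter update from the noisy gradient estimate

print(w)  # close to true_w despite never seeing the full gradient
```

Each update touches only `batch_size` of the 256 samples, which is what makes the per-step cost cheap while the iterates still converge toward the full-data optimum.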

Fokker-Planck to Callan-Symanzik: evolution of weight matrices under training

W Bu, U Kol, Z Liu - arXiv preprint arXiv:2501.09659, 2025 - arxiv.org
The dynamical evolution of a neural network during training has been an incredibly
fascinating subject of study. First-principles derivation of generic evolution of variables in …