Metastable dynamics of neural circuits and networks

BAW Brinkman, H Yan, A Maffei, IM Park… - Applied Physics …, 2022 - pubs.aip.org
Cortical neurons emit seemingly erratic trains of action potentials or “spikes,” and neural
network dynamics emerge from the coordinated spiking activity within neural circuits. These …
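
The “seemingly erratic” spike trains mentioned here are commonly idealized as a Poisson process. A minimal sketch, assuming homogeneous Poisson firing; the rates and duration are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)

    def poisson_spike_train(rate_hz, duration_s, dt=1e-3):
        # Spike independently in each time bin with probability rate*dt.
        n_bins = int(duration_s / dt)
        return rng.random(n_bins) < rate_hz * dt

    # Ten neurons firing at 5 Hz for one second.
    spikes = np.stack([poisson_spike_train(5.0, 1.0) for _ in range(10)])
    print(spikes.sum(axis=1))  # per-neuron spike counts, roughly Poisson(5)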

A new approach for the vanishing gradient problem on sigmoid activation

M Roodschild, J Gotay Sardiñas, A Will - Progress in Artificial Intelligence, 2020 - Springer
The vanishing gradient problem (VGP) is an important issue when training multilayer
neural networks with the backpropagation algorithm. This problem is worse when sigmoid …
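
The severity with sigmoid activations follows from the derivative bound σ′(x) = σ(x)(1 − σ(x)) ≤ 1/4: backpropagation multiplies in one such factor per layer, so the gradient shrinks at least geometrically with depth. A quick numerical illustration:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_grad(x):
        s = sigmoid(x)
        return s * (1.0 - s)  # peaks at 0.25, attained at x = 0

    print(sigmoid_grad(0.0))  # 0.25, the best case
    for depth in (5, 10, 20, 40):
        # Best-case bound on the product of per-layer factors.
        print(depth, 0.25 ** depth)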

Forecasting sequential data using consistent Koopman autoencoders

O Azencot, NB Erichson, V Lin… - … on Machine Learning, 2020 - proceedings.mlr.press
Recurrent neural networks are widely used on time series data, yet such models often
ignore the underlying physical structures in such sequences. A new class of physics-based …
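
A rough sketch of the Koopman-autoencoder idea: observations are lifted to a latent space where the dynamics are linear, and a backward operator is trained to be consistent with the forward one. The linear encoder and all shapes below are illustrative stand-ins for the learned networks in the paper:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative shapes: 3-dim observations, 8-dim latent space.
    E = rng.normal(size=(8, 3))   # encoder (a linear map here, for brevity)
    D = rng.normal(size=(3, 8))   # decoder
    K = rng.normal(size=(8, 8))   # forward Koopman operator
    C = rng.normal(size=(8, 8))   # backward operator

    def predict(x, steps):
        # Advance the latent state linearly, then decode.
        z = E @ x
        for _ in range(steps):
            z = K @ z
        return D @ z

    # The "consistency" idea: penalize K and C for failing to invert
    # each other, so forward and backward dynamics agree.
    penalty = np.linalg.norm(K @ C - np.eye(8)) ** 2
    print(predict(rng.normal(size=3), steps=5), penalty)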

Coupled oscillatory recurrent neural network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies

TK Rusch, S Mishra - arXiv preprint arXiv:2010.00951, 2020 - arxiv.org
Circuits of biological neurons, such as in the functional parts of the brain, can be modeled as
networks of coupled oscillators. Inspired by the ability of these systems to express a rich set …
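
A minimal sketch of a second-order oscillator update in the spirit of coRNN, with hidden state y and velocity z; the damping and coupling constants below are illustrative, not the paper's settings:

    import numpy as np

    rng = np.random.default_rng(0)
    d_in, d_hid, dt, gamma, eps = 2, 16, 0.05, 1.0, 0.1  # illustrative constants

    W  = rng.normal(size=(d_hid, d_hid)) / np.sqrt(d_hid)  # y -> acceleration
    Wz = rng.normal(size=(d_hid, d_hid)) / np.sqrt(d_hid)  # z -> acceleration
    V  = rng.normal(size=(d_hid, d_in))                    # input weights
    b  = np.zeros(d_hid)

    def cornn_step(y, z, u):
        # z is the velocity of the hidden state y; gamma and eps act as
        # restoring-force and damping coefficients.
        z = z + dt * (np.tanh(W @ y + Wz @ z + V @ u + b) - gamma * y - eps * z)
        y = y + dt * z
        return y, z

    y = z = np.zeros(d_hid)
    for u in rng.normal(size=(100, d_in)):  # a dummy input sequence
        y, z = cornn_step(y, z, u)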

Lipschitz recurrent neural networks

NB Erichson, O Azencot, A Queiruga… - arXiv preprint arXiv …, 2020 - arxiv.org
Viewing recurrent neural networks (RNNs) as continuous-time dynamical systems, we
propose a recurrent unit that describes the hidden state's evolution with two parts: a well …
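
A minimal sketch of the two-part evolution alluded to here: a linear term A h plus a bounded nonlinearity, integrated with an Euler step. The paper parameterizes A far more carefully to control the Lipschitz constant; this toy version only gestures at the structure:

    import numpy as np

    rng = np.random.default_rng(0)
    d_in, d_hid, dt = 2, 16, 0.1  # illustrative sizes and step

    M = rng.normal(size=(d_hid, d_hid)) / np.sqrt(d_hid)
    A = 0.5 * (M - M.T) - 0.5 * np.eye(d_hid)  # rotation plus damping
    W = rng.normal(size=(d_hid, d_hid)) / np.sqrt(d_hid)
    U = rng.normal(size=(d_hid, d_in))
    b = np.zeros(d_hid)

    def step(h, x):
        # Euler step of dh/dt = A h + tanh(W h + U x + b).
        return h + dt * (A @ h + np.tanh(W @ h + U @ x + b))

    h = np.zeros(d_hid)
    for x in rng.normal(size=(50, d_in)):  # a dummy input sequence
        h = step(h, x)
    print(np.linalg.norm(h))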

UnICORNN: A recurrent model for learning very long time dependencies

TK Rusch, S Mishra - International Conference on Machine …, 2021 - proceedings.mlr.press
The design of recurrent neural networks (RNNs) to accurately process sequential inputs with
long-time dependencies is very challenging on account of the exploding and vanishing …

Beyond exploding and vanishing gradients: analysing RNN training using attractors and smoothness

AH Ribeiro, K Tiels, LA Aguirre… - … conference on artificial …, 2020 - proceedings.mlr.press
The exploding and vanishing gradient problem has been the major conceptual principle
behind most architecture and training improvements in recurrent neural networks (RNNs) …

Learning better with Dale's Law: A spectral perspective

P Li, J Cornford, A Ghosh… - Advances in Neural …, 2024 - proceedings.neurips.cc
Most recurrent neural networks (RNNs) do not include a fundamental constraint of real
neural circuits: Dale's Law, which implies that neurons must be excitatory (E) or inhibitory (I) …
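
A minimal sketch of imposing Dale's Law on a recurrent weight matrix: each unit is assigned a fixed sign, and all of its outgoing weights share that sign. The 80/20 E/I split is illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 16
    # Each unit is excitatory (+1) or inhibitory (-1); 80/20 split.
    signs = np.where(rng.random(n) < 0.8, 1.0, -1.0)

    # With the convention h_new = W @ h, column j holds the outgoing
    # weights of unit j, so the whole column takes unit j's sign.
    W = np.abs(rng.normal(size=(n, n))) * signs[np.newaxis, :]
    print((np.sign(W) == signs).all())  # True: Dale's Law holds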

Training recurrent neural networks via forward propagation through time

A Kag, V Saligrama - International Conference on Machine …, 2021 - proceedings.mlr.press
Back-propagation through time (BPTT) has been widely used for training Recurrent Neural
Networks (RNNs). BPTT updates RNN parameters on an instance by back-propagating the …
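
For reference, BPTT on a scalar linear recurrence fits in a few lines; this toy sketch (with an illustrative loss of h_T²/2 on the final state) makes visible the repeated multiplication by w that drives gradients to explode or vanish:

    # BPTT for the scalar recurrence h_t = w*h_{t-1} + x_t, h_0 = 0,
    # with loss L = 0.5 * h_T**2; everything here is illustrative.
    def bptt_grad(w, xs):
        hs = [0.0]
        for x in xs:
            hs.append(w * hs[-1] + x)   # forward pass over the sequence
        grad, dh = 0.0, hs[-1]          # dL/dh_T = h_T
        for t in range(len(xs), 0, -1):
            grad += dh * hs[t - 1]      # contribution of w at step t
            dh *= w                     # back-propagate one step in time
        return grad

    print(bptt_grad(0.9, [1.0, 0.5, -0.2]))  # ~2.438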

Physics-informed machine learning for modeling and control of dynamical systems

TX Nghiem, J Drgoňa, C Jones, Z Nagy… - 2023 American …, 2023 - ieeexplore.ieee.org
Physics-informed machine learning (PIML) is a set of methods and tools that systematically
integrate machine learning (ML) algorithms with physical constraints and abstract …
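
A minimal sketch of the physics-informed idea: a training loss that combines a data-fit term with the residual of a known governing equation, here dx/dt = −k x with k assumed known. All values are illustrative:

    import numpy as np

    k, dt = 0.5, 0.1
    t = np.arange(0.0, 2.0, dt)
    rng = np.random.default_rng(0)
    x_data = np.exp(-k * t) + 0.01 * rng.normal(size=t.size)  # noisy samples

    def loss(x_model):
        data_term = np.mean((x_model - x_data) ** 2)
        dxdt = np.gradient(x_model, dt)  # finite-difference derivative
        physics_term = np.mean((dxdt + k * x_model) ** 2)  # ODE residual
        return data_term + physics_term

    print(loss(x_data))  # smoke test: evaluate on the noisy data itself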