Constraining Chaos: Enforcing dynamical invariants in the training of recurrent neural networks

JA Platt, SG Penny, TA Smith, TC Chen… - arXiv preprint arXiv:2304.12865, 2023 - arxiv.org
Drawing on ergodic theory, we introduce a novel method for training machine-learning-based forecasting models of chaotic dynamical systems. The training enforces dynamical invariants of the system of interest--such as the Lyapunov exponent spectrum and fractal dimension--enabling longer and more stable forecasts when operating with limited data. The technique is demonstrated in detail using the recurrent neural network architecture of reservoir computing. Results are given for the Lorenz 1996 chaotic dynamical system and a spectral quasi-geostrophic model, both typical test cases for numerical weather prediction.