Differential equations in data analysis

I Dattner - Wiley Interdisciplinary Reviews: Computational …, 2021 - Wiley Online Library
Differential equations have proven to be a powerful mathematical tool in science and
engineering, leading to better understanding, prediction, and control of dynamic processes …

Optimization with momentum: Dynamical, control-theoretic, and symplectic perspectives

M Muehlebach, MI Jordan - Journal of Machine Learning Research, 2021 - jmlr.org
We analyze the convergence rate of various momentum-based optimization algorithms from
a dynamical systems point of view. Our analysis exploits fundamental topological properties …
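The prototypical example in this line of work is the heavy-ball/Nesterov family, whose iterates can be read as discretizations of a damped second-order ODE. Below is a minimal Python sketch of the heavy-ball case on a toy quadratic, assuming a generic gradient oracle `grad`; it illustrates the dynamical-systems reading rather than the specific analysis of the paper.

```python
import numpy as np

def heavy_ball(grad, x0, step=0.01, momentum=0.9, iters=500):
    """Polyak heavy-ball: x_{k+1} = x_k - step * grad(x_k) + momentum * (x_k - x_{k-1}).

    Read in continuous time, this is a discretization of the damped oscillator
    x'' + a x' + grad f(x) = 0, which is the dynamical-systems viewpoint above.
    """
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = x - step * grad(x) + momentum * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Toy quadratic f(x) = 0.5 * x^T A x with minimizer at the origin.
A = np.diag([1.0, 10.0])
x_final = heavy_ball(lambda x: A @ x, x0=np.array([5.0, 5.0]))
```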

Continuous-in-depth neural networks

AF Queiruga, NB Erichson, D Taylor… - arXiv preprint arXiv …, 2020 - arxiv.org
Recent work has attempted to interpret residual networks (ResNets) as one step of a forward
Euler discretization of an ordinary differential equation, focusing mainly on syntactic …
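The identification in question is that a residual block x + g(x) matches one forward Euler step x_{k+1} = x_k + h f(x_k) when g = h f. A toy Python sketch of that correspondence, with a placeholder weight matrix `W` standing in for a learned layer:

```python
import numpy as np

def forward_euler(f, x0, h, n_steps):
    """Integrate x' = f(x) with forward Euler: x_{k+1} = x_k + h * f(x_k)."""
    x = x0
    for _ in range(n_steps):
        x = x + h * f(x)
    return x

def residual_block(x, W, h=1.0):
    """Toy residual block: identity plus an update, x + h * g(x).
    With g playing the role of the vector field f, stacking such blocks
    is exactly the forward Euler recursion above."""
    return x + h * np.tanh(W @ x)

# Placeholder weights standing in for a trained layer.
W = 0.1 * np.ones((4, 4))
x = np.ones(4)
for _ in range(10):          # a 10-block "network" = 10 Euler steps
    x = residual_block(x, W)
```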

Generalization of the gradient method with fractional order gradient direction

Y Wei, Y Kang, W Yin, Y Wang - Journal of the Franklin Institute, 2020 - Elsevier
Fractional calculus is an efficient tool with the potential to improve the performance of
gradient methods. However, when the first-order gradient direction is generalized by …
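As a rough illustration of what replacing the first-order derivative by a fractional one can look like, here is one Caputo-type update written in Python; the exponent `alpha`, the step `rho`, and the exact form of the iteration are assumptions chosen for illustration, not necessarily the scheme proposed in the paper. For `alpha = 1` it reduces to ordinary gradient descent.

```python
from math import gamma

def fractional_gradient_step(grad, x, x_prev, alpha=0.9, rho=0.1):
    """One Caputo-type fractional gradient step (illustrative form only):
    x_{k+1} = x_k - rho * grad(x_k) * |x_k - x_prev|**(1 - alpha) / Gamma(2 - alpha).
    For alpha = 1 this reduces to plain gradient descent with step rho."""
    return x - rho * grad(x) * abs(x - x_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)

# Minimize f(x) = (x - 3)^2, so grad f(x) = 2 * (x - 3).
grad = lambda x: 2.0 * (x - 3.0)
x_prev, x = 0.0, 0.5
for _ in range(200):
    x_prev, x = x, fractional_gradient_step(grad, x, x_prev)
```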

Continuous-time analysis of accelerated gradient methods via conservation laws in dilated coordinate systems

JJ Suh, G Roh, EK Ryu - International Conference on …, 2022 - proceedings.mlr.press
We analyze continuous-time models of accelerated gradient methods by deriving
conservation laws in dilated coordinate systems. Namely, instead of analyzing the dynamics …
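The standard continuous-time model behind this line of analysis is the Su-Boyd-Candès ODE for Nesterov's method, recalled below together with its classical rate; the specific form of the dilated coordinate shown in the comment is indicative only, not necessarily the one used in the paper.

```latex
% Su-Boyd-Candes ODE: the continuous-time limit of Nesterov's accelerated
% gradient method, and its O(1/t^2) rate for convex f.
\[
  \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
  \qquad X(0) = x_0,\quad \dot{X}(0) = 0,
\]
\[
  f\bigl(X(t)\bigr) - f(x^\star) \;\le\; \frac{2\,\lVert x_0 - x^\star \rVert^2}{t^2}.
\]
% The conservation-law analysis rewrites these dynamics in a dilated
% coordinate such as W(t) = t^{\beta}\bigl(X(t) - x^\star\bigr) (indicative form).
```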

From the Ravine method to the Nesterov method and vice versa: a dynamical system perspective

H Attouch, J Fadili - SIAM Journal on Optimization, 2022 - SIAM
We revisit the Ravine method of Gelfand and Tsetlin from a dynamical system perspective,
study its convergence properties, and highlight its similarities and differences with the …
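The structural difference is the order of the two operations: Nesterov extrapolates and then takes a gradient step, while the Ravine method takes the gradient step and then extrapolates, and the two recursions generate interlaced sequences. A minimal Python sketch of that contrast, assuming the usual (k-1)/(k+2) extrapolation coefficients and a fixed step size:

```python
import numpy as np

def gradient_step(grad, y, s):
    return y - s * grad(y)

def nesterov(grad, x0, s=0.01, iters=300):
    """Nesterov: extrapolate first, then take the gradient step."""
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + 2)
        y = x + beta * (x - x_prev)
        x_prev, x = x, gradient_step(grad, y, s)
    return x

def ravine(grad, y0, s=0.01, iters=300):
    """Ravine: take the gradient step first, then extrapolate.
    Written in the extrapolated variable, Nesterov's recursion has exactly
    this form, which is the correspondence discussed above."""
    y = np.asarray(y0, dtype=float)
    g_prev = gradient_step(grad, y, s)
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + 2)
        g = gradient_step(grad, y, s)
        y = g + beta * (g - g_prev)
        g_prev = g
    return y

# Both drive the same toy quadratic to its minimizer at the origin.
A = np.diag([1.0, 10.0])
x_nag = nesterov(lambda x: A @ x, [5.0, 5.0])
y_rav = ravine(lambda x: A @ x, [5.0, 5.0])
```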

Generalized momentum-based methods: A Hamiltonian perspective

J Diakonikolas, MI Jordan - SIAM Journal on Optimization, 2021 - SIAM
We take a Hamiltonian-based perspective to generalize Nesterov's accelerated gradient
descent and Polyak's heavy ball method to a broad class of momentum methods in the …
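A standard starting point for such Hamiltonian readings (the paper's generalized class is broader than this) pairs the objective with a kinetic term and adds linear damping:

```latex
% One standard dissipative-Hamiltonian formulation behind momentum methods:
% kinetic-plus-potential Hamiltonian with linear (conformal) damping.
\[
  H(x, p) = f(x) + \tfrac{1}{2}\lVert p \rVert^2,
  \qquad
  \dot{x} = \nabla_p H = p,
  \qquad
  \dot{p} = -\nabla_x H - \gamma p = -\nabla f(x) - \gamma p .
\]
% Eliminating p recovers the heavy-ball ODE
% \ddot{x} + \gamma\,\dot{x} + \nabla f(x) = 0.
```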

The connections between Lyapunov functions for some optimization algorithms and differential equations

JM Sanz-Serna, KC Zygalakis - SIAM Journal on Numerical Analysis, 2021 - SIAM
In this manuscript we study the properties of a family of second-order differential equations
with damping, its discretizations, and their connections with accelerated optimization …
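A classical example of the kind of Lyapunov function involved, written here for constant damping and a μ-strongly convex objective (the family studied in the paper may be parameterized differently):

```latex
% Classical example: constant-damping ODE for a mu-strongly convex f,
%   \ddot{X} + 2\sqrt{\mu}\,\dot{X} + \nabla f(X) = 0,
% with the Lyapunov function
\[
  \mathcal{E}(t) = f\bigl(X(t)\bigr) - f(x^\star)
    + \tfrac{1}{2}\bigl\lVert \dot{X}(t) + \sqrt{\mu}\,\bigl(X(t) - x^\star\bigr) \bigr\rVert^2,
\]
% which satisfies \dot{\mathcal{E}}(t) \le -\sqrt{\mu}\,\mathcal{E}(t), giving a
% linear (exponential) convergence rate along the continuous trajectory.
```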

On dissipative symplectic integration with applications to gradient-based optimization

G França, MI Jordan, R Vidal - Journal of Statistical Mechanics …, 2021 - iopscience.iop.org
Recently, continuous-time dynamical systems have proved useful in providing conceptual
and quantitative insights into gradient-based optimization, widely used in modern machine …
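Conformal (dissipative) symplectic integrators split the damped Hamiltonian flow into an exactly solvable dissipation part and a conservative part handled by a symplectic map. A minimal Python sketch of one such splitting step, assuming the damped system x' = p, p' = -∇f(x) - γp; this is one common scheme of the type discussed, not necessarily the paper's exact integrator.

```python
import numpy as np

def conformal_symplectic_step(grad, x, p, h=0.01, gamma=1.0):
    """One step of a conformal-symplectic splitting for the dissipative system
        x' = p,   p' = -grad f(x) - gamma * p.
    The dissipative part is integrated exactly (p -> exp(-gamma*h) * p) and the
    conservative part with symplectic Euler, so the discrete map reproduces the
    conformal contraction of the continuous flow.
    """
    p = np.exp(-gamma * h) * p        # exact flow of p' = -gamma * p
    p = p - h * grad(x)               # symplectic Euler, momentum update
    x = x + h * p                     # symplectic Euler, position update
    return x, p

# Example: quadratic objective f(x) = 0.5 * x^T A x
A = np.diag([1.0, 25.0])
x, p = np.array([3.0, -2.0]), np.zeros(2)
for _ in range(2000):
    x, p = conformal_symplectic_step(lambda z: A @ z, x, p)
```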

Primal–dual methods for large-scale and distributed convex optimization and data analytics

D Jakovetić, D Bajović, J Xavier… - Proceedings of the …, 2020 - ieeexplore.ieee.org
The augmented Lagrangian method (ALM) is a classical optimization tool that solves a given
“difficult” (constrained) problem via finding solutions of a sequence of “easier” (often …
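For a concrete instance of the "difficult problem via easier subproblems" pattern, here is a short Python sketch of ALM for an equality-constrained quadratic program, where each subproblem has a closed-form solution; the problem data and penalty `rho` are chosen only for illustration.

```python
import numpy as np

def augmented_lagrangian(Q, c, A, b, rho=10.0, iters=50):
    """Augmented Lagrangian method for
        min 0.5 * x^T Q x + c^T x   s.t.  A x = b.
    Each "easier" subproblem minimizes
        L_rho(x, lam) = f(x) + lam^T (A x - b) + (rho / 2) * ||A x - b||^2
    over x (closed form for this quadratic), then the multipliers are
    updated with lam <- lam + rho * (A x - b).
    """
    n, m = Q.shape[0], A.shape[0]
    x, lam = np.zeros(n), np.zeros(m)
    H = Q + rho * A.T @ A
    for _ in range(iters):
        rhs = -(c + A.T @ lam) + rho * A.T @ b
        x = np.linalg.solve(H, rhs)          # exact subproblem solution
        lam = lam + rho * (A @ x - b)        # multiplier (dual) update
    return x, lam

# Example: minimize 0.5 * ||x||^2 subject to x1 + x2 = 1 (solution x = (0.5, 0.5)).
x, lam = augmented_lagrangian(Q=np.eye(2), c=np.zeros(2),
                              A=np.array([[1.0, 1.0]]), b=np.array([1.0]))
```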