The recent striking success of deep neural networks in machine learning raises profound questions about the theoretical principles underlying it. For example, what can …
We perform an analysis of the average generalization dynamics of large neural networks trained using gradient descent. We study the practically relevant “high-dimensional” regime …
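The excerpt above concerns how train and test error evolve under gradient descent when the sample size and input dimension are comparable. As a minimal illustrative sketch (not the paper's actual model or analysis), the following teacher-student linear regression tracks both errors over training; all dimensions, step sizes, and the teacher construction here are arbitrary choices for illustration:

```python
import numpy as np

# Illustrative teacher-student setup: the "high-dimensional" regime means
# n_train and d are of the same order. All constants below are assumptions.
rng = np.random.default_rng(0)
d, n_train, n_test = 100, 150, 1000
w_teacher = rng.normal(size=d) / np.sqrt(d)  # unit-scale target weights

X_tr = rng.normal(size=(n_train, d))
y_tr = X_tr @ w_teacher
X_te = rng.normal(size=(n_test, d))
y_te = X_te @ w_teacher

w = np.zeros(d)   # student initialized at zero
lr = 0.5 / d      # small step size, well below the stability threshold
train_err, test_err = [], []
for t in range(500):
    grad = X_tr.T @ (X_tr @ w - y_tr) / n_train  # full-batch gradient of MSE
    w -= lr * grad
    train_err.append(np.mean((X_tr @ w - y_tr) ** 2))
    test_err.append(np.mean((X_te @ w - y_te) ** 2))

print(f"final train MSE: {train_err[-1]:.4f}, test MSE: {test_err[-1]:.4f}")
```

Plotting `train_err` against `test_err` over iterations is the simplest way to see the "generalization dynamics" the snippet refers to: both decay, at rates set by the eigenvalues of the sample covariance.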
A theoretical understanding of generalization remains an open problem for many machine learning models, including deep networks where overparameterization leads to better …
We consider the problem of learning a target function corresponding to a deep, extensive-width, non-linear neural network with random Gaussian weights. We consider the asymptotic …
Deep neural networks achieve stellar generalisation even when they have enough parameters to easily fit all their training data. We study this phenomenon by analysing the …
Memorization and generalization are complementary cognitive processes that jointly promote adaptive behavior. For example, animals should memorize safe routes to specific …
J Pennington, Y Bahri - International conference on machine …, 2017 - proceedings.mlr.press
Understanding the geometry of neural network loss surfaces is important for the development of improved optimization algorithms and for building a theoretical …
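One concrete way to probe loss-surface geometry, in the spirit of the snippet above (though not the paper's random-matrix analysis), is to examine the Hessian eigenvalue spectrum of a small network's loss at a point in weight space. The toy network, data, and finite-difference scheme below are all assumptions chosen to keep the example self-contained:

```python
import numpy as np

# Tiny one-hidden-layer network with squared loss; sizes are illustrative.
rng = np.random.default_rng(1)
n, d, h = 40, 5, 4
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

def loss(theta):
    """Mean squared error of a tanh network, parameters flattened into theta."""
    W1 = theta[: d * h].reshape(d, h)
    w2 = theta[d * h:]
    pred = np.tanh(X @ W1) @ w2
    return 0.5 * np.mean((pred - y) ** 2)

p = d * h + h                      # total parameter count
theta0 = rng.normal(size=p) * 0.5  # a random point in weight space
eps = 1e-4

# Central finite differences for the full p x p Hessian (fine at this scale).
H = np.zeros((p, p))
for i in range(p):
    for j in range(p):
        e_i = np.zeros(p); e_i[i] = eps
        e_j = np.zeros(p); e_j[j] = eps
        H[i, j] = (loss(theta0 + e_i + e_j) - loss(theta0 + e_i - e_j)
                   - loss(theta0 - e_i + e_j) + loss(theta0 - e_i - e_j)) / (4 * eps**2)

eigs = np.linalg.eigvalsh((H + H.T) / 2)  # symmetrize before diagonalizing
print("fraction of negative Hessian eigenvalues:", np.mean(eigs < 0))
```

The fraction of negative eigenvalues distinguishes saddle points from local minima; studying how this spectrum behaves as width grows is the kind of question the excerpt's line of work addresses analytically.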
T Liang, P Sur - The Annals of Statistics, 2022 - projecteuclid.org
A precise high-dimensional asymptotic theory for boosting and minimum-ℓ1-norm interpolated classifiers. The Annals of Statistics, 2022, Vol. 50, No. 3, 1669–1695 …