While neural plasticity has long been studied as the basis of learning, the growth of large-scale neural recording techniques provides a unique opportunity to study how learning …
In this paper, we conduct an empirical study of the feature learning process in deep classifiers. Recent research has identified a training phenomenon called Neural Collapse …
C Yaras, P Wang, Z Zhu… - Advances in neural …, 2022 - proceedings.neurips.cc
When training overparameterized deep networks for classification tasks, it has been widely observed that the learned features exhibit a so-called "neural collapse" phenomenon. More …
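The neural collapse phenomenon referenced in the snippets above includes, as its first property, the collapse of within-class feature variability: last-layer features of each class concentrate around their class mean. A minimal sketch of the standard NC1 metric from the neural-collapse literature is below (the function name `nc1_metric` is illustrative, not from any of the cited papers); it computes the within-class and between-class scatter matrices and returns tr(Σ_W Σ_B⁺)/C, which approaches 0 as collapse progresses.

```python
import numpy as np

def nc1_metric(features, labels):
    """Within-class variability collapse (NC1): tr(Sigma_W @ pinv(Sigma_B)) / C.

    features: (n, d) array of last-layer features; labels: (n,) class labels.
    Returns a scalar that tends to 0 as features collapse onto class means.
    """
    classes = np.unique(labels)
    C = len(classes)
    n, d = features.shape
    global_mean = features.mean(axis=0)
    Sigma_W = np.zeros((d, d))  # within-class scatter
    Sigma_B = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Hc = features[labels == c]
        mu_c = Hc.mean(axis=0)
        diff = Hc - mu_c
        Sigma_W += diff.T @ diff / n
        centered = (mu_c - global_mean)[:, None]
        Sigma_B += centered @ centered.T / C
    return float(np.trace(Sigma_W @ np.linalg.pinv(Sigma_B)) / C)
```

For features that sit exactly at their class means, Σ_W vanishes and the metric is 0; noisy features give a strictly positive value, so tracking it over training epochs visualizes the collapse.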
B Sorscher, S Ganguli… - Proceedings of the …, 2022 - National Acad Sciences
Understanding the neural basis of the remarkable human cognitive capacity to learn novel concepts from just one or a few sensory experiences constitutes a fundamental problem. We …
Neural networks need the right representations of input data to learn. Here we ask how gradient-based learning shapes a fundamental property of representations in recurrent …
Geometric descriptions of deep neural networks (DNNs) have the potential to uncover core representational principles of computational models in neuroscience. Here we examined the …
Deep learning algorithms are responsible for a technological revolution in a variety of tasks, including image recognition and Go playing. Yet, why they work is not understood. Ultimately …
Deep nonparametric regression on approximate manifolds: Nonasymptotic error bounds with polynomial prefactors. The Annals of Statistics 2023, Vol. 51, No. 2, 691–716 …
T Ito, JD Murray - Nature neuroscience, 2023 - nature.com
Human cognition recruits distributed neural processes, yet the organizing computational and functional architectures remain unclear. Here, we characterized the geometry and …