R Wu, V Papyan - arXiv preprint arXiv:2405.17767, 2024 - arxiv.org
Neural collapse ($\mathcal{NC}$) is a phenomenon observed in classification tasks where top-layer representations collapse into their class means, which become equinorm …
Deep neural networks (DNNs) at convergence consistently represent the training data in the last layer via a highly symmetric geometric structure referred to as neural collapse. This …
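For reference, the "highly symmetric geometric structure" these two snippets describe is usually formalized as the four neural-collapse properties of Papyan, Han and Donoho (2020); a brief LaTeX restatement in our own notation (last-layer features $h_{i,c}$, class means $\mu_c$, global mean $\mu_G$, classifier rows $w_c$, $C$ classes), not taken verbatim from either paper:

```latex
\begin{align*}
&\textbf{(NC1)} \quad h_{i,c} \to \mu_c
  \quad\text{(within-class variability collapses to zero)} \\
&\textbf{(NC2)} \quad \|\mu_c - \mu_G\| = \|\mu_{c'} - \mu_G\|, \qquad
  \cos\angle\!\left(\mu_c - \mu_G,\; \mu_{c'} - \mu_G\right) = -\tfrac{1}{C-1}
  \quad (c \neq c') \\
&\textbf{(NC3)} \quad w_c \propto \mu_c - \mu_G
  \quad\text{(classifier aligns with the centered class means)} \\
&\textbf{(NC4)} \quad \arg\max_{c}\, \langle w_c, h\rangle + b_c
  \;=\; \arg\min_{c}\, \|h - \mu_c\|
  \quad\text{(nearest class-center decisions)}
\end{align*}
```

NC2 says the centered class means form a simplex equiangular tight frame: equinorm, with all pairwise cosines equal to $-1/(C-1)$.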
W Hong, S Ling - arXiv preprint arXiv:2309.09725, 2023 - arxiv.org
Recent years have witnessed the huge success of deep neural networks (DNNs) in various computer vision and text-processing tasks. Interestingly, these DNNs with massive …
W Hong, S Ling - Journal of Machine Learning Research, 2024 - jmlr.org
Neural Collapse (NC) is a fascinating phenomenon that arises during the terminal phase of training (TPT) of deep neural networks (DNNs). Specifically, for balanced training datasets …
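A minimal numpy sketch of the quantity commonly used to track NC1 during TPT, the within-/between-class variability ratio $\operatorname{tr}(\Sigma_W \Sigma_B^{\dagger})/C$; the function name and the exact normalization follow the usual convention in the NC literature, not necessarily this paper's measurements:

```python
import numpy as np

def nc1_metric(features: np.ndarray, labels: np.ndarray) -> float:
    """Within-/between-class variability ratio tr(Sigma_W Sigma_B^+) / C.

    features: (N, d) last-layer activations; labels: (N,) integer classes.
    Values near 0 indicate within-class variability collapse (NC1).
    """
    classes = np.unique(labels)
    C, d = len(classes), features.shape[1]
    N = len(features)
    mu_G = features.mean(axis=0)                      # global mean
    sigma_W = np.zeros((d, d))
    sigma_B = np.zeros((d, d))
    for c in classes:
        f_c = features[labels == c]
        diff = f_c - f_c.mean(axis=0)                 # deviations from class mean
        sigma_W += diff.T @ diff / N                  # within-class covariance
        centered = (f_c.mean(axis=0) - mu_G)[:, None]
        sigma_B += (centered @ centered.T) / C        # between-class covariance
    return float(np.trace(sigma_W @ np.linalg.pinv(sigma_B)) / C)
```

Logging this metric once per epoch on the training set is the standard way to see it decay toward zero during the terminal phase of training.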
To better understand feature learning in neural networks, we propose and study linear models in tangent feature space, where the features are allowed to be transformed …
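The "tangent feature space" here is presumably the usual first-order (NTK-style) linearization of the network around its initial parameters; stated for completeness in our notation, not necessarily the snippet's:

```latex
% The gradient \nabla_\theta f(x;\theta_0) acts as a fixed (tangent)
% feature map, so the model is linear in \theta - \theta_0:
f_{\mathrm{lin}}(x;\theta)
  \;=\; f(x;\theta_0)
  \;+\; \big\langle \nabla_\theta f(x;\theta_0),\; \theta - \theta_0 \big\rangle .
```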
Deep neural networks (DNNs) exhibit a surprising structure in their final layer known as neural collapse (NC), and a growing body of work has investigated the …
Y Nam, C Mingard, SH Lee, S Hayou… - arXiv preprint arXiv …, 2024 - arxiv.org
Deep neural networks (DNNs) exhibit a remarkable ability to automatically learn data representations, finding appropriate features without human input. Here we present a …
Contemporary machine learning algorithms train artificial neural networks by setting network weights to a single optimized configuration through gradient descent on task-specific …
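For concreteness, the "single optimized configuration" is the fixed point reached by iterating the standard gradient-descent update on a task-specific loss $L$ (our notation, not the snippet's):

```latex
\theta_{t+1} \;=\; \theta_t \;-\; \eta \,\nabla_\theta L(\theta_t),
\qquad t = 0, 1, 2, \ldots
```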
The widespread use of Deep Neural Networks (DNNs) in various applications has underscored their effectiveness, yet the fundamental principles behind their success largely …