Task arithmetic has recently emerged as a cost-effective and scalable approach to edit pre-trained models directly in weight space: By adding the fine-tuned weights of different tasks …
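A minimal sketch of the weight-space editing this snippet describes, assuming task vectors are defined as the difference between fine-tuned and pre-trained weights; the weight dictionaries, coefficient values, and toy tensors below are hypothetical stand-ins, not the paper's setup.

```python
# Sketch of task arithmetic: edit a pre-trained model by adding scaled
# task vectors tau = theta_finetuned - theta_pretrained (toy tensors only).
import torch

def task_vector(pretrained, finetuned):
    """Per-parameter difference between a fine-tuned and the pre-trained model."""
    return {k: finetuned[k] - pretrained[k] for k in pretrained}

def apply_task_vectors(pretrained, vectors, coeffs):
    """Add a scaled sum of task vectors to the pre-trained weights."""
    edited = {k: v.clone() for k, v in pretrained.items()}
    for tau, lam in zip(vectors, coeffs):
        for k in edited:
            edited[k] += lam * tau[k]
    return edited

# Toy weights standing in for two fine-tuned checkpoints of the same model.
pretrained  = {"w": torch.zeros(3)}
finetuned_a = {"w": torch.tensor([1.0, 0.0, 0.0])}
finetuned_b = {"w": torch.tensor([0.0, 1.0, 0.0])}
tau_a = task_vector(pretrained, finetuned_a)
tau_b = task_vector(pretrained, finetuned_b)
multitask = apply_task_vectors(pretrained, [tau_a, tau_b], coeffs=[0.5, 0.5])
```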
Implicit neural representations (INRs) have recently emerged as a promising alternative to classical discretized representations of signals. Nevertheless, despite their practical …
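A minimal sketch of the idea behind implicit neural representations as described above: a small MLP maps continuous coordinates to signal values, standing in for a discretized grid of samples. The network size, toy 1-D signal, and training loop are illustrative assumptions only.

```python
# Sketch of an INR: fit a coordinate-to-value MLP to a toy 1-D signal.
import torch
import torch.nn as nn

inr = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
t = torch.linspace(0.0, 1.0, 256).unsqueeze(-1)   # continuous coordinates
signal = torch.sin(8 * torch.pi * t)              # toy signal to represent
opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
for _ in range(200):                              # regress the MLP onto the signal
    opt.zero_grad()
    loss = ((inr(t) - signal) ** 2).mean()
    loss.backward()
    opt.step()
```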
Modern deep neural networks are highly over-parameterized compared to the data on which they are trained, yet they often generalize remarkably well. A flurry of recent work has asked …
Neural networks in the lazy training regime converge to kernel machines. Can neural networks in the rich feature learning regime learn a kernel machine with a data-dependent …
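For context, the kernel-machine limit referred to above is usually stated via the lazy-training linearization around the initialization $\theta_0$; the notation below is generic, not the paper's:

$f(x;\theta) \;\approx\; f(x;\theta_0) + \nabla_\theta f(x;\theta_0)^{\top}(\theta - \theta_0), \qquad K_{\mathrm{NTK}}(x,x') \;=\; \nabla_\theta f(x;\theta_0)^{\top}\,\nabla_\theta f(x';\theta_0),$

so gradient descent on the linearized model reduces to kernel regression with the (fixed) tangent kernel $K_{\mathrm{NTK}}$, whereas in the rich regime the effective kernel can change with the data during training.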
N Tsilivis, J Kempe - Advances in Neural Information …, 2022 - proceedings.neurips.cc
The adversarial vulnerability of neural nets, and subsequent techniques to create robust models have attracted significant attention; yet we still lack a full understanding of this …
G Ortiz-Jiménez… - Advances in Neural …, 2021 - proceedings.neurips.cc
For certain infinitely-wide neural networks, the neural tangent kernel (NTK) theory fully characterizes generalization, but for the networks used in practice, the empirical NTK only …
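A minimal sketch of one common way to form the empirical NTK mentioned above: the Gram matrix of per-example parameter gradients at the current weights. The toy MLP, inputs, and flattening helper below are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch of an empirical NTK: K[i, j] = grad_theta f(x_i) . grad_theta f(x_j)
# evaluated at the current parameters of a toy scalar-output MLP.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
x = torch.randn(5, 2)  # five toy inputs

def flat_grad(scalar_out):
    """Gradient of one scalar output w.r.t. all parameters, flattened."""
    grads = torch.autograd.grad(scalar_out, model.parameters(), retain_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

outputs = model(x).squeeze(-1)                        # one scalar per input
jac = torch.stack([flat_grad(o) for o in outputs])    # (n_samples, n_params)
empirical_ntk = jac @ jac.T                           # (n_samples, n_samples)
```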
B Bordelon, C Pehlevan - The Eleventh International Conference on …, 2022 - openreview.net
It is unclear how changing the learning rule of a deep neural network alters its learning dynamics and representations. To gain insight into the relationship between learned …
We investigate the spectral properties of linear-width feed-forward neural networks, where the sample size is asymptotically proportional to network width. Empirically, we show that the …
This work bridges two important concepts: the Neural Tangent Kernel (NTK), which captures the evolution of deep neural networks (DNNs) during training, and the Neural Collapse (NC) …
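For context, the neural collapse phenomenon referred to above is commonly summarized by two conditions on the last-layer features at the end of training; the notation is generic, not the paper's:

$\text{(NC1)}\ \ \Sigma_W \to 0$ (within-class variability of last-layer features collapses), and $\text{(NC2)}$ the centered, normalized class means $\tilde{\mu}_c$ approach a simplex equiangular tight frame, with $\langle \tilde{\mu}_c, \tilde{\mu}_{c'} \rangle \to -\tfrac{1}{C-1}$ for $c \neq c'$.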