Overview frequency principle/spectral bias in deep learning

ZQJ Xu, Y Zhang, T Luo - Communications on Applied Mathematics and …, 2024 - Springer
Understanding deep learning has become increasingly urgent as it penetrates further into
industry and science. In recent years, a line of research based on Fourier analysis has shed light on …
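
The frequency principle (F-Principle) this line of work identifies is that networks tend to fit low-frequency components of a target before high-frequency ones. A minimal sketch of how one might observe this, assuming a 1-D regression target that mixes two frequencies; the model, optimizer, and Fourier probe are illustrative choices, not the survey's setup:

```python
import torch

# Hypothetical 1-D target mixing a low (k=1) and a high (k=5) frequency.
x = torch.linspace(-torch.pi, torch.pi, 256).unsqueeze(1)
y = torch.sin(x) + torch.sin(5 * x)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 128), torch.nn.Tanh(),
    torch.nn.Linear(128, 128), torch.nn.Tanh(),
    torch.nn.Linear(128, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def freq_error(residual, k):
    # Magnitude of the k-th Fourier coefficient of the fitting residual.
    return torch.fft.rfft(residual.squeeze(), norm="forward")[k].abs().item()

for step in range(2001):
    opt.zero_grad()
    residual = net(x) - y
    loss = residual.pow(2).mean()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        # The low-frequency error typically shrinks first, the high one later.
        print(step, freq_error(residual.detach(), 1), freq_error(residual.detach(), 5))
```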

Task arithmetic in the tangent space: Improved editing of pre-trained models

G Ortiz-Jimenez, A Favero… - Advances in Neural …, 2024 - proceedings.neurips.cc
Task arithmetic has recently emerged as a cost-effective and scalable approach to editing pre-
trained models directly in weight space: by adding the fine-tuned weights of different tasks …
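
The mechanism named in the snippet, a task vector being the difference between fine-tuned and pre-trained weights, with edits formed as sums of such vectors, can be sketched directly on checkpoints. A minimal sketch, assuming PyTorch state_dicts; the scaling coefficient alpha is an illustrative knob, and the paper's tangent-space variant applies the same arithmetic to linearized models:

```python
import torch

def task_vector(pretrained, finetuned):
    # tau = theta_ft - theta_pre, computed parameter-by-parameter.
    return {k: finetuned[k] - pretrained[k] for k in pretrained}

def apply_task_vectors(pretrained, vectors, alpha=1.0):
    # theta_edit = theta_pre + alpha * sum_i tau_i
    edited = {k: v.clone() for k, v in pretrained.items()}
    for tau in vectors:
        for k in edited:
            edited[k] += alpha * tau[k]
    return edited

# Illustrative usage with any nn.Module checkpoints:
# tau_a = task_vector(base.state_dict(), model_a.state_dict())
# tau_b = task_vector(base.state_dict(), model_b.state_dict())
# base.load_state_dict(apply_task_vectors(base.state_dict(), [tau_a, tau_b], alpha=0.5))
```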

A structured dictionary perspective on implicit neural representations

G Yüce, G Ortiz-Jiménez… - Proceedings of the …, 2022 - openaccess.thecvf.com
Implicit neural representations (INRs) have recently emerged as a promising alternative to
classical discretized representations of signals. Yet despite their practical …
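
An INR parameterizes a signal as a network mapping coordinates to values; the paper analyzes sinusoidal architectures of this kind through a dictionary lens. A minimal SIREN-style sketch, assuming a 1-D signal; the frequency scale w0 and layer widths are illustrative hyperparameters:

```python
import torch

class SineLayer(torch.nn.Module):
    # x -> sin(w0 * (Wx + b)), the building block of SIREN-style INRs.
    def __init__(self, d_in, d_out, w0=30.0):
        super().__init__()
        self.linear = torch.nn.Linear(d_in, d_out)
        self.w0 = w0

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

# The INR maps coordinates to signal values, e.g. t -> f(t) for audio
# or (x, y) -> (r, g, b) for images, and is queryable at any resolution.
inr = torch.nn.Sequential(SineLayer(1, 64), SineLayer(64, 64), torch.nn.Linear(64, 1))
coords = torch.linspace(0, 1, 100).unsqueeze(1)
values = inr(coords)
```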

The low-rank simplicity bias in deep networks

M Huh, H Mobahi, R Zhang, B Cheung… - arXiv preprint arXiv …, 2021 - arxiv.org
Modern deep neural networks are highly over-parameterized compared to the data on which
they are trained, yet they often generalize remarkably well. A flurry of recent work has asked …

Neural networks as kernel learners: The silent alignment effect

A Atanasov, B Bordelon, C Pehlevan - arXiv preprint arXiv:2111.00034, 2021 - arxiv.org
Neural networks in the lazy training regime converge to kernel machines. Can neural
networks in the rich feature learning regime learn a kernel machine with a data-dependent …
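
The "kernel machine" claim in the snippet has a concrete form: in the lazy regime the network stays close to its linearization, so the trained predictor is approximately kernel regression with the neural tangent kernel. A standard statement from the NTK literature this work builds on, for gradient flow on squared loss trained to convergence on data (X, y):

```latex
% Empirical NTK at parameters \theta:
\Theta(x, x') = \nabla_\theta f(x;\theta)^\top \nabla_\theta f(x';\theta)

% Lazy-regime predictor after training:
f_\infty(x) \approx f_0(x) + \Theta(x, X)\,\Theta(X, X)^{-1}\bigl(y - f_0(X)\bigr)
```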

What can the neural tangent kernel tell us about adversarial robustness?

N Tsilivis, J Kempe - Advances in Neural Information …, 2022 - proceedings.neurips.cc
The adversarial vulnerability of neural networks, and subsequent techniques for creating robust
models, have attracted significant attention; yet we still lack a full understanding of this …

What can linearized neural networks actually say about generalization?

G Ortiz-Jiménez… - Advances in Neural …, 2021 - proceedings.neurips.cc
For certain infinitely wide neural networks, neural tangent kernel (NTK) theory fully
characterizes generalization, but for the networks used in practice, the empirical NTK only …
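
The empirical NTK the snippet refers to is directly computable: its Gram matrix collects inner products of per-sample parameter gradients. A minimal sketch, assuming a scalar-output model; the toy MLP and data are illustrative, and a gradient loop is used for clarity rather than speed:

```python
import torch

def empirical_ntk(model, xs):
    # Theta[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>
    params = [p for p in model.parameters() if p.requires_grad]
    grads = []
    for x in xs:
        out = model(x.unsqueeze(0)).squeeze()  # scalar output f(x_i)
        g = torch.autograd.grad(out, params)
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    J = torch.stack(grads)  # (n, num_params) Jacobian of outputs w.r.t. parameters
    return J @ J.T          # (n, n) NTK Gram matrix

mlp = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
xs = torch.randn(8, 2)
theta = empirical_ntk(mlp, xs)
print(theta.shape)  # torch.Size([8, 8])
```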

The influence of learning rule on representation dynamics in wide neural networks

B Bordelon, C Pehlevan - The Eleventh International Conference on …, 2022 - openreview.net
It is unclear how changing the learning rule of a deep neural network alters its learning
dynamics and representations. To gain insight into the relationship between learned …

Spectral evolution and invariance in linear-width neural networks

Z Wang, A Engel, AD Sarwate… - Advances in Neural …, 2024 - proceedings.neurips.cc
We investigate the spectral properties of linear-width feed-forward neural networks, where
the sample size is asymptotically proportional to network width. Empirically, we show that the …

Neural (tangent kernel) collapse

M Seleznova, D Weitzner, R Giryes… - Advances in …, 2024 - proceedings.neurips.cc
This work bridges two important concepts: the Neural Tangent Kernel (NTK), which captures
the evolution of deep neural networks (DNNs) during training, and the Neural Collapse (NC) …