Task arithmetic has recently emerged as a cost-effective and scalable approach to edit pre-trained models directly in weight space: By adding the fine-tuned weights of different tasks …
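The editing operation this snippet names has a simple form: a task vector is the difference between fine-tuned and pre-trained weights, and the pre-trained model is edited by adding a scaled sum of task vectors. Below is a minimal sketch in PyTorch; the toy MLP and the scaling factor `alpha` are illustrative assumptions, not the snippet's actual models or settings.

```python
import copy
import torch
import torch.nn as nn

def task_vector(pretrained: nn.Module, finetuned: nn.Module) -> dict:
    # tau = theta_finetuned - theta_pretrained, one tensor per parameter
    pre, fin = pretrained.state_dict(), finetuned.state_dict()
    return {k: fin[k] - pre[k] for k in pre}

def edit_in_weight_space(pretrained: nn.Module, vectors: list,
                         alpha: float = 0.3) -> nn.Module:
    # theta_edited = theta_pre + alpha * sum_t tau_t
    edited = copy.deepcopy(pretrained)
    state = edited.state_dict()
    for tau in vectors:
        for k in state:
            state[k] = state[k] + alpha * tau[k]
    edited.load_state_dict(state)
    return edited

# Toy usage: two perturbed copies of a small MLP stand in for task-specific
# fine-tuned models (hypothetical, for illustration only).
base = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
ft_a, ft_b = copy.deepcopy(base), copy.deepcopy(base)
with torch.no_grad():
    for p in ft_a.parameters():
        p.add_(0.01 * torch.randn_like(p))
    for p in ft_b.parameters():
        p.add_(0.01 * torch.randn_like(p))
multi_task = edit_in_weight_space(base, [task_vector(base, ft_a),
                                         task_vector(base, ft_b)])
```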
Implicit neural representations (INRs) have recently emerged as a promising alternative to classical discretized representations of signals. Nevertheless, despite their practical …
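As a concrete reference point for this snippet: an INR replaces a discretized signal (e.g., a pixel grid or sample array) with a network that maps coordinates to signal values, which can then be queried at arbitrary, off-grid coordinates. The sketch below fits a tiny coordinate MLP to a 1-D toy signal; the architecture and training loop are illustrative assumptions, not the paper's setup.

```python
import torch
import torch.nn as nn

# A tiny implicit neural representation: coordinate in, signal value out.
inr = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))

# Toy continuous signal to represent: f(x) = sin(4*pi*x) on [0, 1],
# observed on a coarse grid of 256 samples.
x = torch.linspace(0, 1, 256).unsqueeze(1)
y = torch.sin(4 * torch.pi * x)

opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = ((inr(x) - y) ** 2).mean()
    loss.backward()
    opt.step()

# The trained network is a continuous representation: query it off-grid.
y_query = inr(torch.tensor([[0.123]]))
```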
We analytically investigate how over-parameterization of models in randomized machine learning algorithms impacts the information leakage about their training data. Specifically …
We study the effect of width on the dynamics of feature-learning neural networks across a variety of architectures and datasets. Early in training, wide neural networks trained on …
N Tsilivis, J Kempe - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
The adversarial vulnerability of neural nets, and subsequent techniques to create robust models have attracted significant attention; yet we still lack a full understanding of this …
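For readers unfamiliar with the vulnerability this snippet refers to: small, gradient-aligned input perturbations can flip a network's predictions. The classic illustration is the Fast Gradient Sign Method (Goodfellow et al., 2015), sketched below; this is a standard textbook attack, not the method studied in this paper.

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=0.03):
    # One-step FGSM: perturb x by eps in the direction of the sign of the
    # loss gradient with respect to the input. Assumes `model` returns
    # classification logits and `y` holds integer labels.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()
```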
The "Neural Tangent Kernel" (NTK) (Jacot et al., 2018), and its empirical variants have been proposed as a proxy to capture certain behaviors of real neural networks. In this work, we …
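The "empirical variants" mentioned here are finite-width NTKs computed from actual network gradients: $K(x, x') = \langle \nabla_\theta f(x; \theta), \nabla_\theta f(x'; \theta) \rangle$. A minimal sketch using torch.func, with a toy scalar-output network standing in for a real model:

```python
import torch
import torch.nn as nn
from torch.func import functional_call, jacrev

net = nn.Sequential(nn.Linear(3, 32), nn.Tanh(), nn.Linear(32, 1))
params = dict(net.named_parameters())

def f(p, x):
    # Scalar network output for a single input x under parameters p.
    return functional_call(net, p, (x.unsqueeze(0),)).squeeze()

def empirical_ntk(x1, x2):
    # K(x1, x2) = <grad_theta f(x1), grad_theta f(x2)>
    g1 = jacrev(f)(params, x1)  # dict of per-parameter gradients
    g2 = jacrev(f)(params, x2)
    return sum((g1[k] * g2[k]).sum() for k in g1)

k = empirical_ntk(torch.randn(3), torch.randn(3))
```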
We investigate the spectral properties of linear-width feed-forward neural networks, where the sample size is asymptotically proportional to network width. Empirically, we show that the …
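The linear-width regime this snippet describes can be simulated in a few lines: hold the ratio of sample size to width fixed and inspect the eigenvalue spectrum of the hidden-feature Gram matrix at initialization. A toy experiment under assumed Gaussian data and weights, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)
width = 500
n = 2 * width                 # linear-width regime: n proportional to width

# Random one-hidden-layer ReLU features at initialization (toy stand-in).
X = rng.standard_normal((n, width)) / np.sqrt(width)
W = rng.standard_normal((width, width)) / np.sqrt(width)
H = np.maximum(X @ W, 0.0)    # hidden features, shape (n, width)

# Empirical eigenvalue spectrum of the feature Gram matrix H H^T / width.
eigs = np.linalg.eigvalsh(H @ H.T / width)
print(eigs.min(), eigs.max())
```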
L Petrini, F Cagnetta… - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
It is widely believed that the success of deep networks lies in their ability to learn a meaningful representation of the features of the data. Yet, understanding when and how this …
Two key challenges facing modern deep learning are mitigating deep networks' vulnerability to adversarial attacks and understanding deep learning's generalization capabilities …