P Glandorf, T Kaiser… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Sparse neural networks are a key factor in developing resource-efficient machine learning applications. We propose the novel and powerful sparse learning method Adaptive …
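The snippet truncates before spelling out the method; as a generic illustration of how a sparsity-inducing penalty yields sparse networks (a minimal sketch, not the abstract's algorithm), the following assumes a toy model and a hypothetical penalty weight `l1_strength`:

```python
import torch
import torch.nn as nn

# Generic sketch: training with an L1 penalty drives weights toward zero,
# after which small-magnitude weights can be pruned. This is a textbook
# sparsity recipe, not the (truncated) method the abstract names.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
l1_strength = 1e-4  # hypothetical hyperparameter

x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss = loss + l1_strength * sum(p.abs().sum() for p in model.parameters())
    loss.backward()
    opt.step()

# Report the fraction of near-zero weights induced by the penalty.
total = sum(p.numel() for p in model.parameters())
near_zero = sum((p.abs() < 1e-3).sum().item() for p in model.parameters())
print(f"near-zero weights: {near_zero / total:.1%}")
```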
We introduce a meta-learning algorithm for adversarially robust classification. The proposed method tries to be as model-agnostic as possible and optimizes a dataset prior to its …
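The meta-learning procedure itself is cut off here; as background for "adversarially robust classification", a minimal sketch of the standard FGSM perturbation (Goodfellow et al., 2015) that robust training methods typically defend against; the toy model, the `eps` budget, and the random data are illustrative assumptions, not the paper's setup:

```python
import torch
import torch.nn as nn

def fgsm_attack(model, x, y, eps):
    """Fast Gradient Sign Method: a one-step perturbation of the input
    in the direction that increases the loss (Goodfellow et al., 2015)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
x_adv = fgsm_attack(model, x, y, eps=0.1)  # eps is a hypothetical budget
# Adversarial training would minimize the loss on x_adv instead of x.
```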
Distilling knowledge from a large teacher model to a lightweight one is a widely successful approach for generating compact, powerful models in the semi-supervised learning setting …
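As context for the distillation setting, a minimal sketch of the standard soft-label distillation loss (Hinton et al., 2015); the temperature `T` and mixing weight `alpha` are hypothetical hyperparameters, and the abstract's semi-supervised variant is not reproduced here:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Mix a hard cross-entropy term with a KL term between
    temperature-softened teacher and student distributions."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescales gradients to match the hard term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * hard + (1 - alpha) * soft

s = torch.randn(16, 10, requires_grad=True)  # student logits
t = torch.randn(16, 10)                      # teacher logits (frozen)
labels = torch.randint(0, 10, (16,))
print(distillation_loss(s, t, labels).item())
```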
Neural networks typically exhibit permutation symmetry, as reordering neurons in each layer does not change the underlying function they compute. These symmetries contribute to the …
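This symmetry is easy to verify directly: permuting the hidden units of one layer, together with the matching columns of the next layer's weights, leaves the computed function unchanged. A minimal numpy sketch with an assumed two-layer ReLU network:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 10)), rng.normal(size=64)  # layer 1
W2, b2 = rng.normal(size=(3, 64)), rng.normal(size=3)    # layer 2

def mlp(x, W1, b1, W2, b2):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return W2 @ h + b2

x = rng.normal(size=10)
perm = rng.permutation(64)  # reorder the 64 hidden neurons

# Permute layer-1 outputs and layer-2 inputs consistently:
# the network computes exactly the same function.
out_perm = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)
assert np.allclose(mlp(x, W1, b1, W2, b2), out_perm)
```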
Machine learning (ML) models are overparameterized to support generality and avoid overfitting. Prior works have shown that these additional parameters can be used for both …
A Azeemi, I Qazi, A Raza - Findings of the Association for …, 2023 - aclanthology.org
Model pruning methods reduce memory requirements and inference time of large-scale pre-trained language models after deployment. However, the actual pruning …
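For reference, a common baseline that such studies start from is global unstructured magnitude pruning; a minimal PyTorch sketch, where the layer sizes and the 50% sparsity target are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(768, 768), nn.ReLU(), nn.Linear(768, 2))

# Global unstructured magnitude pruning: zero out the 50% of weights
# with the smallest absolute value across both linear layers.
params = [(model[0], "weight"), (model[2], "weight")]
prune.global_unstructured(params, pruning_method=prune.L1Unstructured, amount=0.5)

sparsity = float((model[0].weight == 0).float().mean())
print(f"layer-0 sparsity after pruning: {sparsity:.1%}")
```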
Pruning is a standard technique for reducing the computational cost of deep networks. Many advances in pruning leverage concepts from the Lottery Ticket Hypothesis (LTH). LTH …
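As a reminder of the LTH recipe, a minimal sketch of train / prune-by-magnitude / rewind-to-init / retrain on a toy model; the 20% keep ratio, the model, and the data are illustrative assumptions rather than any specific paper's protocol:

```python
import copy
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
init_state = copy.deepcopy(model.state_dict())  # saved initialization
x, y = torch.randn(64, 20), torch.randint(0, 2, (64,))

def train(model, masks=None, steps=200):
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.cross_entropy(model(x), y).backward()
        opt.step()
        if masks:  # keep pruned weights at zero during retraining
            with torch.no_grad():
                for name, p in model.named_parameters():
                    if name in masks:
                        p.mul_(masks[name])

train(model)

# Keep the largest-magnitude 20% of each weight matrix.
masks = {}
for name, p in model.named_parameters():
    if p.dim() == 2:
        thresh = p.abs().flatten().kthvalue(int(0.8 * p.numel())).values
        masks[name] = (p.abs() > thresh).float()

model.load_state_dict(init_state)  # rewind to initialization
with torch.no_grad():
    for name, p in model.named_parameters():
        if name in masks:
            p.mul_(masks[name])
train(model, masks)  # the retrained sparse subnetwork is the "ticket"
```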
O Ranadive, N Thakurdesai, AS Morcos… - arXiv preprint arXiv …, 2023 - arxiv.org
It is commonly observed that deep networks trained for classification exhibit class-selective neurons in their early and intermediate layers. Intriguingly, recent studies have shown that …
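A concrete way to quantify this is the class selectivity index of Morcos et al. (2018), which compares a neuron's mean activation on its most-activating class against its mean activation on all other classes; a minimal sketch on synthetic activations:

```python
import numpy as np

def class_selectivity(acts, labels, n_classes):
    """Class selectivity index of one neuron (Morcos et al., 2018):
    (mu_max - mu_rest) / (mu_max + mu_rest), where mu_max is the mean
    activation on the most-activating class and mu_rest is the mean of
    the per-class means over all remaining classes. Assumes non-negative
    (e.g. post-ReLU) activations, so the index lies in [0, 1]."""
    class_means = np.array([acts[labels == c].mean() for c in range(n_classes)])
    top = class_means.argmax()
    mu_max = class_means[top]
    mu_rest = np.delete(class_means, top).mean()
    return (mu_max - mu_rest) / (mu_max + mu_rest + 1e-12)

rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=1000)
acts = rng.random(1000) + (labels == 3) * 2.0  # neuron favoring class 3
# ~0.67 here; 0 means no class preference, 1 means maximal selectivity.
print(class_selectivity(acts, labels, n_classes=10))
```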
N Bar, R Giryes - arXiv preprint arXiv:2406.01086, 2024 - arxiv.org
Having large amounts of annotated data significantly impacts the effectiveness of deep neural networks. However, the annotation task can be very expensive in some domains …