Robust training under label noise by over-parameterization

S Liu, Z Zhu, Q Qu, C You - International Conference on …, 2022 - proceedings.mlr.press
Recently, over-parameterized deep networks, with increasingly more network parameters
than training samples, have dominated the performance of modern machine learning …

Exploring lottery ticket hypothesis in spiking neural networks

Y Kim, Y Li, H Park, Y Venkatesha, R Yin… - European Conference on …, 2022 - Springer
Spiking Neural Networks (SNNs) have recently emerged as a new generation of
low-power deep neural networks that are suitable for implementation on low-power …

The lazy neuron phenomenon: On emergence of activation sparsity in transformers

Z Li, C You, S Bhojanapalli, D Li, AS Rawat… - arXiv preprint arXiv …, 2022 - arxiv.org
This paper studies the curious phenomenon, observed in machine learning models with Transformer
architectures, that their activation maps are sparse. By activation map, we refer to the …

Dynamic Sparse Learning: A Novel Paradigm for Efficient Recommendation

S Wang, Y Sui, J Wu, Z Zheng, H Xiong - Proceedings of the 17th ACM …, 2024 - dl.acm.org
In the realm of deep learning-based recommendation systems, the increasing computational
demands, driven by the growing number of users and items, pose a significant challenge to …

HyperSparse Neural Networks: Shifting Exploration to Exploitation through Adaptive Regularization

P Glandorf, T Kaiser… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Sparse neural networks are a key factor in developing resource-efficient machine learning
applications. We propose the novel and powerful sparse learning method Adaptive …

Ten lessons we have learned in the new "Sparseland": A short handbook for sparse neural network researchers

S Liu, Z Wang - arXiv preprint arXiv:2302.02596, 2023 - arxiv.org
This article does not propose any novel algorithm or new hardware for sparsity. Instead, it
aims to serve the "common good" for the increasingly prosperous Sparse Neural Network …

Visual prompting upgrades neural network sparsification: A data-model perspective

C Jin, T Huang, Y Zhang, M Pechenizkiy, S Liu… - arXiv preprint arXiv …, 2023 - arxiv.org
The rapid development of large-scale deep learning models calls into question the affordability of
hardware platforms, which necessitates pruning to reduce their computational and …

Learning adversarially robust sparse networks via weight reparameterization

C Li, Q Qiu, Z Zhang, J Guo, X Cheng - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Although increasing model size can enhance the adversarial robustness of deep neural
networks, in resource-constrained environments, there exist critical sparsity constraints …

Automatic noise filtering with dynamic sparse training in deep reinforcement learning

B Grooten, G Sokar, S Dohare, E Mocanu… - arXiv preprint arXiv …, 2023 - arxiv.org
Tomorrow's robots will need to distinguish useful information from noise when performing
different tasks. A household robot, for instance, may continuously receive a plethora of …

PARAMOUNT: Towards generalizable deeP leARning for mmwAve beaM selectiOn using sUb-6GHz chaNnel measuremenTs

K Vuckovic, MB Mashhadi, F Hejazi… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Deep neural networks (DNNs) in the wireless communication domain have been shown to
generalize poorly to scenarios where the train and test datasets follow a different …