Abstract Spiking Neural Networks (SNNs) have recently emerged as a new generation of low-power deep neural networks, which are suitable for implementation on low-power …
This paper studies the curious phenomenon that the activation maps of machine learning models with Transformer architectures are sparse. By activation map we refer to the …
S Wang, Y Sui, J Wu, Z Zheng, H Xiong - Proceedings of the 17th ACM …, 2024 - dl.acm.org
In the realm of deep learning-based recommendation systems, the increasing computational demands, driven by the growing number of users and items, pose a significant challenge to …
P Glandorf, T Kaiser… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Sparse neural networks are a key factor in developing resource-efficient machine learning applications. We propose the novel and powerful sparse learning method Adaptive …
S Liu, Z Wang - arXiv preprint arXiv:2302.02596, 2023 - arxiv.org
This article does not propose any novel algorithm or new hardware for sparsity. Instead, it aims to serve the "common good" for the increasingly prosperous Sparse Neural Network …
The rapid development of large-scale deep learning models calls into question the affordability of hardware platforms, which necessitates pruning to reduce their computational and …
C Li, Q Qiu, Z Zhang, J Guo, X Cheng - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Although increasing model size can enhance the adversarial robustness of deep neural networks, in resource-constrained environments there exist critical sparsity constraints …
Tomorrow's robots will need to distinguish useful information from noise when performing different tasks. A household robot for instance may continuously receive a plethora of …
Deep neural networks (DNNs) in the wireless communication domain have been shown to generalize poorly to scenarios where the training and test datasets follow a different …