Deep learning algorithms are increasingly employed at the edge. However, edge devices are resource constrained and thus require efficient deployment of deep neural networks …
P Glandorf, T Kaiser… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Sparse neural networks are a key factor in developing resource-efficient machine learning applications. We propose the novel and powerful sparse learning method Adaptive …
R Chen, K Lin, B Hong, S Zhang, F Yang - Heliyon, 2024 - cell.com
In previous research, the prevailing assumption was that Graph Neural Networks (GNNs) precisely capture the connections among nodes in the graph structure …
J Xie, Y Zhang, M Lin, L Cao… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Post-training Sparsity (PTS) is a recently emerged line of work that pursues efficient network sparsity using only limited data. Existing PTS methods, however, suffer from significant …
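The snippet above is cut off before it describes the actual PTS method, so the following is only a generic, minimal sketch of the post-training-sparsity idea under assumed details: prune a pretrained linear layer using nothing but a small calibration batch, scoring each weight by its magnitude times the mean magnitude of its input feature. The function name `calibration_prune` and all shapes are illustrative, not taken from the cited work.

```python
import numpy as np

def calibration_prune(W, X_calib, sparsity=0.5):
    """Zero out the lowest-scoring weights of a pretrained linear layer.

    W        : (out_features, in_features) pretrained weights
    X_calib  : (n_samples, in_features) small calibration batch ("limited data")
    sparsity : fraction of weights to remove
    """
    act_scale = np.abs(X_calib).mean(axis=0)        # per-input-feature magnitude
    scores = np.abs(W) * act_scale[None, :]         # weight-importance proxy
    k = int(sparsity * W.size)
    threshold = np.partition(scores.ravel(), k)[k]  # k-th smallest score
    mask = scores >= threshold
    return W * mask, mask

# Toy usage: prune a random "pretrained" layer to 50% sparsity with 8 calibration samples.
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 32))
X_calib = rng.normal(size=(8, 32))
W_sparse, mask = calibration_prune(W, X_calib, sparsity=0.5)
print("fraction of weights kept:", mask.mean())
```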
Following the success of transformers in natural language processing (NLP), there has been a shift towards transformer models in computer vision. While transformers perform well and offer …
Recent research has focused on weight sparsity in neural network training to reduce FLOPs, aiming for improved efficiency (test accuracy wrt training FLOPs). However, sparse weight …
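This abstract is also truncated before the method itself, so the sketch below shows only the general idea, not the cited work's algorithm: keep a fixed binary mask on a weight matrix during training so most entries stay exactly zero, which is what would let a sparse kernel skip those multiply-accumulates and reduce training FLOPs. All sizes and the 90% sparsity level are arbitrary choices.

```python
import torch

torch.manual_seed(0)
in_dim, out_dim, batch = 64, 32, 16
sparsity = 0.9                                       # 90% of weights are fixed at zero

mask = (torch.rand(out_dim, in_dim) > sparsity).float()
W = (torch.randn(out_dim, in_dim) * mask).requires_grad_()   # pruned entries start at zero
x, y = torch.randn(batch, in_dim), torch.randn(batch, out_dim)

opt = torch.optim.SGD([W], lr=1e-2)
for step in range(100):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(x @ W.t(), y)
    loss.backward()
    W.grad.mul_(mask)                                # pruned entries receive no update
    opt.step()

dense_flops = 2 * batch * in_dim * out_dim
sparse_flops = 2 * batch * int(mask.sum())           # what an ideal sparse kernel could do
print(f"matmul FLOPs per step: dense={dense_flops}, ideal sparse={sparse_flops}")
```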
HM Choi, H Kang, D Oh - International Conference on …, 2023 - proceedings.mlr.press
Compact representation of multimedia signals using implicit neural representations (INRs) has advanced significantly over the past few years, and recent works address their …
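Since this snippet breaks off before the methods it surveys, here is only a toy sketch of the basic INR idea itself: overfit a small coordinate MLP to a single 1-D signal so that the network's weights become the stored representation. Practical INR codecs typically use periodic activations (e.g., SIREN) or Fourier feature encodings plus weight quantization, all omitted here; the architecture and signal below are arbitrary assumptions.

```python
import torch

torch.manual_seed(0)

# Toy 1-D "multimedia" signal sampled on [0, 1]; an INR maps coordinate t -> value.
t = torch.linspace(0, 1, 256).unsqueeze(1)
signal = torch.sin(2 * torch.pi * 3 * t) + 0.5 * torch.sin(2 * torch.pi * 7 * t)

inr = torch.nn.Sequential(              # the weights of this MLP *are* the representation
    torch.nn.Linear(1, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(inr.parameters(), lr=1e-3)

for step in range(2000):                # deliberately overfit the one signal
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(inr(t), signal)
    loss.backward()
    opt.step()

print("reconstruction MSE:", loss.item())
print("stored parameters:", sum(p.numel() for p in inr.parameters()))
```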
P Hu, S Li, L Huang - arXiv preprint arXiv:2408.11746, 2024 - arxiv.org
Large language models (LLMs) have made significant strides in complex tasks, yet their widespread adoption is impeded by substantial computational demands. With hundreds of …
Recent works have explored the use of weight sparsity to improve the training efficiency (test accuracy wrt training FLOPs) of deep neural networks (DNNs). These works aim to reduce …