R Mishra, H Gupta - ACM Computing Surveys, 2023 - dl.acm.org
Deep Neural Networks (DNNs) have gained unprecedented popularity due to their high-order performance and automated feature extraction capability. This has encouraged …
Recently, Vision Transformer (ViT) has continuously established new milestones in the computer vision field, while the high computation and memory cost makes its …
Recently, a new trend of exploring sparsity for accelerating neural network training has emerged, embracing the paradigm of training on the edge. This paper proposes a novel …
Channel pruning has been broadly recognized as an effective technique to reduce the computation and memory cost of deep convolutional neural networks. However …
G Yuan, P Behnam, Z Li, A Shafiee… - 2021 ACM/IEEE 48th …, 2021 - ieeexplore.ieee.org
Recent work demonstrated the promise of using resistive random access memory (ReRAM) as an emerging technology to perform inherently parallel analog domain in-situ matrix …
W Xu, W Fang, Y Ding, M Zou, N Xiong - IEEE Access, 2021 - ieeexplore.ieee.org
The ever-increasing number of Internet of Things (IoT) devices are continuously generating huge masses of data, but the current cloud-centric approach for IoT big data analysis has …
Machine learning (ML) models are widely used in many important domains. For efficiently processing these computational- and memory-intensive applications, tensors of these …
Q Jin, J Ren, OJ Woodford, J Wang… - Proceedings of the …, 2021 - openaccess.thecvf.com
Generative Adversarial Networks (GANs) have achieved huge success in generating high-fidelity images; however, they suffer from low efficiency due to tremendous …
J Guo, D Xu, W Ouyang - IEEE Transactions on Neural …, 2023 - ieeexplore.ieee.org
Observing that the existing model compression approaches only focus on reducing the redundancies in convolutional neural networks (CNNs) along one particular dimension (e.g. …