Y Liu, D Cheng, D Zhang, S Xu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Capsule networks (CapsNets) are known to be difficult to scale to deeper architectures, which are desirable for high performance in the deep learning era, owing to the complex …
J Chang, Y Lu, P Xue, Y Xu, Z Wei - Knowledge-Based Systems, 2023 - Elsevier
Convolutional neural networks (CNNs) have shown excellent performance in numerous computer vision tasks. However, the high computational and memory demands in computer …
Y Zhang, NM Freris - IEEE Transactions on Neural Networks …, 2023 - ieeexplore.ieee.org
Filter pruning is advocated for accelerating deep neural networks without dedicated hardware or libraries, while maintaining high prediction accuracy. Several works have cast …
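The filter-pruning idea this snippet refers to is often implemented by ranking convolutional filters with a magnitude criterion and discarding the weakest ones. A minimal sketch of that ranking step, assuming an L1-norm importance score and a toy weight tensor (both my choices, not taken from the cited paper):

```python
import numpy as np

def prune_filters_l1(weights, keep_ratio=0.5):
    """Rank conv filters (out_ch, in_ch, kh, kw) by L1 norm; keep the top fraction."""
    scores = np.abs(weights).sum(axis=(1, 2, 3))       # one importance score per filter
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])  # indices of the strongest filters
    return weights[keep], keep

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))                      # toy 8-filter 3x3 conv layer
pruned, kept = prune_filters_l1(w, keep_ratio=0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

In a real network the matching input channels of the next layer are removed as well, which is what makes the approach hardware-friendly: the pruned model is simply a smaller dense network.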
Neural architecture search (NAS) and network pruning are widely studied efficient AI techniques, but not yet perfect. NAS performs exhaustive candidate architecture search …
Channel pruning has been widely studied as a prevailing method that effectively reduces both computational cost and memory footprint of the original network while keeping a …
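The computational saving channel pruning delivers can be illustrated with back-of-the-envelope FLOP counting; the layer sizes below are toy values of my own, not from the cited work:

```python
def conv_macs(out_ch, in_ch, kh, kw, h, w):
    """Multiply-accumulate count of a conv layer producing an h x w output map."""
    return out_ch * in_ch * kh * kw * h * w

full = conv_macs(64, 64, 3, 3, 56, 56)
# Pruning half the channels shrinks this layer's outputs AND the next layer's
# inputs, so the cost of the layer in between falls quadratically.
pruned = conv_macs(32, 32, 3, 3, 56, 56)
print(pruned / full)  # 0.25
```

This quadratic effect is why even moderate channel-sparsity targets translate into large reductions in both compute and activation memory.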
With the ever-increasing popularity of edge devices, it is necessary to implement real-time segmentation on the edge for autonomous driving and many other applications. Vision …
W Hu, Z Che, N Liu, M Li, J Tang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Deep convolutional neural networks have been shown to carry substantial parametric and computational redundancy in many application scenarios, and an increasing number of …
Dynamic Neural Networks (DNNs) are an evolving research field within deep learning (DL), offering a robust, adaptable, and efficient alternative to the conventional Static …
Deep learning technologies have demonstrated remarkable effectiveness in a wide range of tasks, and deep learning holds the potential to advance a multitude of applications …