H Ji, C Ding, B Huang, Y Huan… - IEEE Journal on …, 2024 - ieeexplore.ieee.org
The explosive development of convolutional neural networks (CNNs) benefits greatly from hardware-based acceleration to maintain low latency and high resource utilization. To …
MS Ibrahim, M Usman, JA Lee - arXiv preprint arXiv:2412.13724, 2024 - arxiv.org
Convolutional Neural Networks (CNNs) are crucial in various applications, but their deployment on resource-constrained edge devices poses challenges. This study presents …
In the realm of neural network computation, optical neural network accelerators (ONNs) have emerged as a promising solution, leveraging the inherent speed and parallelism of …
X Wu, M Wang, J Lin, Z Wang - IEEE Transactions on Very …, 2024 - ieeexplore.ieee.org
Inspired by the key operation of vision transformers (ViTs), convolutional neural networks (CNNs) have widely adopted arbitrary-kernel convolutions to achieve high performance in …
L He, Y Zhao, R Gao, Y Du, L Du - arXiv preprint arXiv:2407.02913, 2024 - arxiv.org
Fast convolution algorithms, including Winograd and FFT, can efficiently accelerate convolution operations in deep models. However, these algorithms depend on high …
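As a concrete illustration of the multiplication savings such fast algorithms offer, the sketch below implements the classic Winograd F(2,3) minimal-filtering algorithm in NumPy: two outputs of a 3-tap filter from four multiplications instead of six. It is a generic textbook construction under these assumptions (1-D tile, CNN-style correlation), not the design of any paper listed here.

# Minimal sketch of Winograd F(2,3) minimal filtering: 2 outputs of a
# 3-tap filter from 4 multiplications instead of 6. Illustrative only,
# not a specific paper's implementation. Outputs are the CNN-style
# correlation y[i] = sum_k d[i+k] * g[k].
import numpy as np

def winograd_f2_3(d: np.ndarray, g: np.ndarray) -> np.ndarray:
    """Two filter outputs over a 4-sample tile using 4 multiplies."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2   # the two filter sums can be
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2   # precomputed and reused per tile
    m4 = (d1 - d3) * g2
    return np.array([m1 + m2 + m3, m2 - m3 - m4])

# Sanity check against the direct sliding-window computation.
d = np.random.randn(4)
g = np.random.randn(3)
direct = np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                   d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])
assert np.allclose(winograd_f2_3(d, g), direct)

In practice the filter-side transforms (the two sums of g) are computed once per kernel, so the per-tile cost is dominated by the four elementwise multiplies, which is the source of the complexity reduction these papers exploit.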
S Chen, Z Liu, W Li, Z Hu, M Zhang, S Cui… - arXiv preprint arXiv …, 2024 - arxiv.org
Introducing the Fermat number transform into chromatic dispersion compensation and adaptive equalization reduces the computational complexity by 68% compared …
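For readers unfamiliar with the Fermat number transform, the toy sketch below shows the underlying idea: a number-theoretic transform over the Fermat prime F_4 = 2^16 + 1 yields exact integer cyclic convolution with no rounding error. The naive O(N^2) transform, the modulus choice, and the length-8 example are illustrative assumptions only; the paper's actual transform length and equalizer structure are not reproduced here.

# Minimal sketch of a Fermat number transform: an NTT over Z mod F_4 = 2^16 + 1.
# 65537 is prime and 3 is a primitive root, so roots of unity of any
# power-of-two order up to 2^16 exist. Results are exact integers as long as
# the true convolution values stay below the modulus.
P = 2**16 + 1          # Fermat prime F_4
G = 3                  # primitive root modulo P

def fnt(a, invert=False):
    """Naive O(N^2) number-theoretic transform modulo the Fermat prime P."""
    n = len(a)                              # n must divide P - 1 = 2^16
    w = pow(G, (P - 1) // n, P)             # primitive n-th root of unity
    if invert:
        w = pow(w, P - 2, P)                # inverse root for the inverse transform
    out = [sum(a[j] * pow(w, i * j, P) for j in range(n)) % P for i in range(n)]
    if invert:
        n_inv = pow(n, P - 2, P)            # divide by n via the modular inverse
        out = [(x * n_inv) % P for x in out]
    return out

def cyclic_conv(x, h):
    """Exact length-N cyclic convolution: FNT, pointwise product, inverse FNT."""
    X, H = fnt(x), fnt(h)
    return fnt([(a * b) % P for a, b in zip(X, H)], invert=True)

# Example: length-8 cyclic convolution of small integer sequences.
print(cyclic_conv([1, 2, 3, 4, 0, 0, 0, 0], [1, 1, 0, 0, 0, 0, 0, 0]))
# -> [1, 3, 5, 7, 4, 0, 0, 0]

The attraction of moduli of the form 2^b + 1 is that many twiddle-factor multiplications reduce to bit shifts and the arithmetic is exact, which is what makes the transform appealing for fixed-point equalizer hardware.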
Y Huang, J Mai, W Jiang, E Yao - 2024 IEEE International …, 2024 - ieeexplore.ieee.org
Edge computing is a framework that offers fewer computing resources than cloud computing but brings enterprise applications closer to data sources such as the Internet of …
Y Hu - Proceedings of the 2023 4th International Conference …, 2023 - dl.acm.org
This paper introduces FFT1d-Conv, a method to accelerate convolution operations in convolutional neural networks. Convolution is one of the most computationally intensive …
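The general FFT route to fast 1-D convolution that such methods build on can be summarized in a few lines of NumPy: zero-pad, transform, multiply pointwise, and invert. This is a minimal sketch of the convolution theorem, not the FFT1d-Conv method itself; the function name fft_conv1d and the use of real FFTs are assumptions for illustration.

# Minimal sketch of FFT-based fast 1-D convolution (the general technique,
# not this paper's specific method). Assumes NumPy and real-valued signals;
# zero-padding both sequences to length len(x) + len(h) - 1 turns the
# circular convolution of the FFT into the full linear convolution.
import numpy as np

def fft_conv1d(x: np.ndarray, h: np.ndarray) -> np.ndarray:
    """Linear convolution of x and h via the convolution theorem."""
    n = len(x) + len(h) - 1        # length of the full linear convolution
    X = np.fft.rfft(x, n)          # transform zero-padded input
    H = np.fft.rfft(h, n)          # transform zero-padded kernel
    return np.fft.irfft(X * H, n)  # pointwise product, then inverse FFT

# Sanity check against direct convolution.
x = np.random.randn(64)
h = np.random.randn(5)
assert np.allclose(fft_conv1d(x, h), np.convolve(x, h))

The O(n log n) transforms replace the O(n * k) sliding-window multiplications, which is why this route pays off once kernels or signals are long enough.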
Fast convolution algorithms like Winograd and the Fourier transform are well-known for their substantial reduction in the multiplication complexity of Convolutional Neural Networks …