T Choudhary, V Mishra, A Goswami… - Artificial Intelligence …, 2020 - Springer
In recent years, machine learning (ML) and deep learning (DL) have shown remarkable improvements in computer vision, natural language processing, stock prediction, forecasting …
Deploying large language models (LLMs) is challenging because they are memory-inefficient and compute-intensive for practical applications. In response, researchers train …
As camera and LiDAR sensors capture complementary information in autonomous driving, great efforts have been made to perform semantic segmentation with multi-modality data …
In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its …
We investigate the design aspects of feature distillation methods that achieve network compression and propose a novel feature distillation method in which the distillation loss is …
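The snippet cuts off before the loss is defined; as an illustration only, a common form of feature distillation projects an intermediate student feature map to the teacher's channel width with a small adaptation layer and penalizes their L2 distance. This is a generic sketch, not this paper's loss:

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistillLoss(nn.Module):
    """Generic feature-matching distillation: project student features to the
    teacher's channel width, then penalize their L2 distance."""
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # 1x1 conv as the adaptation layer (an assumed, common choice).
        self.adapt = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        # Teacher features are treated as fixed targets.
        return F.mse_loss(self.adapt(student_feat), teacher_feat.detach())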
G Xu, Z Liu, X Li, CC Loy - European conference on computer vision, 2020 - Springer
Knowledge distillation, which involves extracting the “dark knowledge” from a teacher network to guide the learning of a student network, has emerged as an important …
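The snippet states the teacher-student idea only at a high level; a minimal sketch of the classic soft-target distillation objective (temperature-scaled KL divergence plus cross-entropy, in the style of Hinton et al.) is given below as an assumption about the standard formulation, not this paper's exact loss:

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-target distillation: KL between temperature-softened
    teacher and student distributions, mixed with the usual cross-entropy."""
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce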
Q Guo, X Wang, Y Wu, Z Yu, D Liang… - Proceedings of the …, 2020 - openaccess.thecvf.com
This work presents an efficient yet effective online Knowledge Distillation method via Collaborative Learning, termed KDCL, which is able to consistently improve the …
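The snippet gives only the name and the claim; online collaborative distillation is commonly implemented by building a shared soft target from the peers' logits at each step and distilling every peer towards it. The plain logit average below is an assumption for illustration, not necessarily KDCL's aggregation rule:

import torch
import torch.nn.functional as F

def collaborative_kd_loss(peer_logits, labels, T=3.0, alpha=0.5):
    """Online distillation among peers: each network is trained with
    cross-entropy plus KL towards a soft target built from all peers."""
    # peer_logits: list of [batch, num_classes] tensors, one per peer network.
    # Assumed aggregation: plain average of logits.
    ensemble = torch.stack(peer_logits, dim=0).mean(dim=0).detach()
    soft_target = F.softmax(ensemble / T, dim=1)

    losses = []
    for logits in peer_logits:
        ce = F.cross_entropy(logits, labels)
        kd = F.kl_div(F.log_softmax(logits / T, dim=1), soft_target,
                      reduction="batchmean") * (T * T)
        losses.append((1 - alpha) * ce + alpha * kd)
    return sum(losses)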
J Park, S Samarakoon, M Bennis… - Proceedings of the …, 2019 - ieeexplore.ieee.org
Fueled by the availability of more data and computing power, recent breakthroughs in cloud-based machine learning (ML) have transformed every aspect of our lives from face …
S Yun, J Park, K Lee, J Shin - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com
Deep neural networks with millions of parameters may suffer from poor generalization due to overfitting. To mitigate the issue, we propose a new regularization method that penalizes the …
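The snippet cuts off before the penalty is defined; purely as an illustration, one self-distillation-style regularizer of this kind penalizes the divergence between the predictive distributions of two different samples, treating one prediction as a soft target for the other. The form below is an assumption, not a reconstruction of the paper's loss:

import torch
import torch.nn.functional as F

def consistency_penalty(model, x_a, x_b, T=4.0):
    """Illustrative self-regularization: push the predictive distribution of
    x_a towards that of a related sample x_b (e.g. one from the same class)."""
    with torch.no_grad():
        # One sample provides a fixed soft target for the other.
        target = F.softmax(model(x_b) / T, dim=1)
    log_pred = F.log_softmax(model(x_a) / T, dim=1)
    return F.kl_div(log_pred, target, reduction="batchmean") * (T * T)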