A Survey of Knowledge Distillation in Deep Neural Networks

Han Yu - Computer Science and Application, 2020 - hanspub.org
Deep neural networks have achieved great success in many fields such as computer vision, natural language processing, and speech recognition, but as network architectures grow more complex, neural network models consume large amounts of computational resources and storage space …

Review of Knowledge Distillation in Convolutional Neural Network Compression.

M Xianfa, LIU Fang, LI Guang… - Journal of Frontiers of …, 2021 - search.ebscohost.com
In recent years, the convolutional neural network (CNN) has made remarkable achievements in many applications in the field of image analysis, owing to its powerful feature-extraction ability …

The compression techniques applied on deep learning model

H He, L Huang, Z Huang, T Yang - Highlights in Science, Engineering …, 2022 - drpress.org
In recent years, smartphone penetration has gradually approached saturation, and artificial intelligence is a cutting-edge technology that can trigger disruptive changes. Deep …

A Review of Research and Applications of Knowledge Distillation Methods

Si Zhaofeng, Qi Honggang - Journal of Image and Graphics, 2023 - cjig.cn
With the continuous development of deep learning methods, their storage and computational costs keep growing, which poses a challenge to their application on resource-constrained platforms. To meet this challenge, researchers have proposed a series of neural network compression methods …

Application of Model Compression Technology Based on Knowledge Distillation in Convolutional Neural Network Lightweight

F Wang, C Pan, J Huang - 2022 China Automation Congress …, 2022 - ieeexplore.ieee.org
Nowadays, neural networks are becoming deeper and deeper, the number of parameters keeps increasing, and model complexity continues to grow. These …

Recent research trends on Model Compression and Knowledge Transfer in CNNs

H Xue, K Ren - … on Computer Science, Artificial Intelligence and …, 2021 - ieeexplore.ieee.org
Convolutional neural network (CNN) is the main tool for deep learning and computer vision,
and it has many applications in face recognition, sign language recognition and speech …

Feature distribution-based knowledge distillation for deep neural networks

H Hong, H Kim - 2022 19th International SoC Design …, 2022 - ieeexplore.ieee.org
In recent years, various compression methods and compact models have been actively proposed to address the significant computational costs that accompany the achievement of …

A Survey of Deep Neural Network Compression and Acceleration

Ji Rongrong, Lin Shaohui, Chao Fei, Wu Yongjian, Huang Feiyue - Journal of Computer Research and Development, 2018 - core.ac.uk
Journal of Computer Research and Development, 55(9): 1871-1888, 2018. DOI: 10.7544/issn1000-1239.2018.20180129. Received 2018-02-21 …

A Survey of Deep Neural Network Compression and Acceleration

Hu Haolin, Lin Xiangwei - Journal of Signal Processing, 2022 - scholars.cityu.edu.hk
In recent years, with the rapid improvement of graphics processing unit (GPU) performance, deep neural networks have made tremendous progress and achieved outstanding results in many artificial intelligence tasks. However, mainstream deep learning network models suffer from high computational complexity …

Parallel blockwise knowledge distillation for deep neural network compression

C Blakeney, X Li, Y Yan, Z Zong - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Deep neural networks (DNNs) have been extremely successful in solving many challenging
AI tasks in natural language processing, speech recognition, and computer vision …