Authors
Aosong Feng, Priyadarshini Panda
Publication date
2020/7/19
Conference
2020 International Joint Conference on Neural Networks (IJCNN)
Pages
1-7
Publisher
IEEE
Description
Deep learning has achieved state-of-the-art accuracies on several computer vision tasks. However, the computational and energy requirements associated with training such deep neural networks can be quite high. In this paper, we propose a cumulative training strategy with Net2Net transformation that achieves training computational efficiency without incurring large accuracy loss, in comparison to a model trained from scratch. We achieve this by first training a small network (with fewer parameters) on a small subset of the original dataset, and then gradually expanding the network using Net2Net transformation to train incrementally on larger subsets of the dataset. This incremental training strategy with Net2Net utilizes function-preserving transformations that transfer knowledge from each previous small network to the next larger network, thereby reducing the overall training complexity. Our experiments …
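The key ingredient the abstract refers to is the function-preserving expansion from Net2Net (Chen et al., 2015): a trained small network is widened so that the larger network computes exactly the same function before further training resumes on more data. Below is a minimal NumPy sketch of the Net2WiderNet-style widening of one hidden layer in a ReLU MLP; the helper name `net2wider` and the specific shapes are illustrative assumptions, not the paper's released code.

```python
import numpy as np

def net2wider(W1, b1, W2, new_width, rng=None):
    """Function-preserving widening of a hidden layer (Net2WiderNet-style sketch).

    Widens the layer from W1.shape[0] units to `new_width` units without
    changing the network's input-output mapping, so training can continue
    from the transferred weights. Hypothetical helper for illustration.
    """
    rng = np.random.default_rng(rng)
    old_width = W1.shape[0]
    assert new_width >= old_width
    # Each new unit copies an existing unit; the extras pick random originals.
    mapping = np.concatenate([np.arange(old_width),
                              rng.integers(0, old_width, new_width - old_width)])
    # Replicate the incoming weights and biases of the copied units.
    W1_new = W1[mapping, :]
    b1_new = b1[mapping]
    # Divide outgoing weights by each unit's replication count so that the
    # summed contribution to the next layer is unchanged.
    counts = np.bincount(mapping, minlength=old_width)
    W2_new = W2[:, mapping] / counts[mapping]
    return W1_new, b1_new, W2_new

# Quick check that the widened ReLU network computes the same output.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)
W1w, b1w, W2w = net2wider(W1, b1, W2, new_width=16, rng=1)
y_old = W2 @ np.maximum(W1 @ x + b1, 0) + b2
y_new = W2w @ np.maximum(W1w @ x + b1w, 0) + b2
assert np.allclose(y_old, y_new)
```

In the cumulative strategy described above, a transformation of this kind would be applied at each stage so the next, larger network starts from the knowledge of the previous one rather than from scratch.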
Total citations
[Citations-per-year chart: 2021–2024]