Authors
Daniel Bankman, Lita Yang, Bert Moons, Marian Verhelst, Boris Murmann
Publication date
2018/10/3
Journal
IEEE Journal of Solid-State Circuits
Volume
54
Issue
1
Pages
158-172
Publisher
IEEE
Description
The trend of pushing inference from cloud to edge due to concerns of latency, bandwidth, and privacy has created demand for energy-efficient neural network hardware. This paper presents a mixed-signal binary convolutional neural network (CNN) processor for always-on inference applications that achieves 3.8 μJ/classification at 86% accuracy on the CIFAR-10 image classification data set. The goal of this paper is to establish the minimum-energy point for the representative CIFAR-10 inference task, using the available design tradeoffs. The BinaryNet algorithm for training neural networks with weights and activations constrained to +1 and -1 drastically simplifies multiplications to XNOR and allows integrating all memory on-chip. A weight-stationary, data-parallel architecture with input reuse amortizes memory access across many computations, leaving wide vector summation as the remaining energy bottleneck …
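The description above notes that constraining weights and activations to +1 and -1 (BinaryNet) reduces multiplications to XNOR. As a minimal illustrative sketch (not the authors' mixed-signal implementation), the snippet below shows the standard XNOR/popcount identity behind that claim: encode +1 as bit 1 and -1 as bit 0, then a dot product of two packed vectors is 2*popcount(XNOR) - N. The function and variable names are hypothetical.

```python
# Sketch of the XNOR/popcount dot product used in binary neural networks.
# Assumes {+1, -1} vectors packed into Python ints, bit 1 encoding +1.
import numpy as np

def binary_dot(act_bits: int, wgt_bits: int, n: int) -> int:
    """Dot product of two n-element {+1, -1} vectors given as packed bits."""
    xnor = ~(act_bits ^ wgt_bits) & ((1 << n) - 1)  # 1 wherever the signs match
    matches = bin(xnor).count("1")                  # popcount of matching positions
    return 2 * matches - n                          # matches minus mismatches

# Reference check against an ordinary signed dot product.
rng = np.random.default_rng(0)
n = 16
a = rng.choice([-1, +1], size=n)
w = rng.choice([-1, +1], size=n)
a_bits = sum(1 << i for i, v in enumerate(a) if v == +1)
w_bits = sum(1 << i for i, v in enumerate(w) if v == +1)
assert binary_dot(a_bits, w_bits, n) == int(np.dot(a, w))
```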
Total citations
[Citations-per-year chart, 2018-2024; per-year counts not recoverable]