Authors
Zhongrui Wang, Can Li, Peng Lin, Mingyi Rao, Yongyang Nie, Wenhao Song, Qinru Qiu, Yunning Li, Peng Yan, John Paul Strachan, Ning Ge, Nathan McDonald, Qing Wu, Miao Hu, Huaqiang Wu, R Stanley Williams, Qiangfei Xia, J Joshua Yang
Publication date
2019/9
Journal
Nature Machine Intelligence
Volume
1
Issue
9
Pages
434-442
Publisher
Nature Publishing Group UK
Description
The explosive growth of machine learning is largely due to the recent advancements in hardware and architecture. The engineering of network structures, taking advantage of the spatial or temporal translational invariance of patterns, naturally leads to bio-inspired, shared-weight structures such as convolutional neural networks, which have markedly reduced the number of free parameters. State-of-the-art microarchitectures commonly rely on weight-sharing techniques, but still suffer from the von Neumann bottleneck of transistor-based platforms. Here, we experimentally demonstrate the in situ training of a five-level convolutional neural network that self-adapts to non-idealities of the one-transistor one-memristor array to classify the MNIST dataset, achieving similar accuracy to the memristor-based multilayer perceptron with a reduction in trainable parameters of ~75% owing to the shared weights. In addition, the …
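The ~75% reduction in trainable parameters comes from weight sharing: each convolutional kernel is reused across every spatial position of the image, whereas a fully connected layer assigns a separate weight to every input pixel. The Python sketch below illustrates the parameter-count arithmetic for MNIST-sized (28x28) inputs; the layer sizes are illustrative assumptions and do not reproduce the paper's exact five-level network.

# Minimal sketch (assumed layer sizes, not the paper's architecture):
# compare trainable parameter counts of a small CNN with shared kernels
# against a one-hidden-layer MLP on flattened 28x28 MNIST images.

def conv_params(in_ch, out_ch, k):
    # One conv layer: out_ch kernels of size in_ch*k*k, shared across
    # all spatial positions, plus one bias per output feature map.
    return out_ch * (in_ch * k * k + 1)

def fc_params(n_in, n_out):
    # One fully connected layer: a distinct weight per input-output pair,
    # plus one bias per output unit.
    return n_out * (n_in + 1)

# Hypothetical small CNN: two 3x3 conv layers, then a classifier layer
# acting on 7x7 pooled feature maps.
cnn = (
    conv_params(1, 8, 3)          # 1 -> 8 feature maps
    + conv_params(8, 16, 3)       # 8 -> 16 feature maps
    + fc_params(16 * 7 * 7, 10)   # 10-way classifier
)

# Hypothetical MLP: one hidden layer of 64 units on the 784-pixel input.
mlp = fc_params(28 * 28, 64) + fc_params(64, 10)

print(f"CNN trainable parameters: {cnn}")   # 9098
print(f"MLP trainable parameters: {mlp}")   # 50890
print(f"Reduction from weight sharing: {1 - cnn / mlp:.0%}")

With these assumed sizes the sharing yields roughly an 80% reduction; the exact figure depends on the chosen layer widths, which is consistent with the ~75% reported for the network in the paper.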
Total citations
[Yearly citation histogram, 2019–2024; per-year counts not recoverable from the page extraction]
Scholar articles
Z Wang, C Li, P Lin, M Rao, Y Nie, W Song, Q Qiu, Y Li… - Nature Machine Intelligence, 2019