Authors
Yuhang Song, Thomas Lukasiewicz, Zhenghua Xu, Rafal Bogacz
Publication date
2020
Journal
Advances in Neural Information Processing Systems
Volume
33
Pages
22566-22579
Description
Backpropagation (BP) has been the most successful algorithm used to train artificial neural networks. However, there are several gaps between BP and learning in biologically plausible neuronal networks of the brain (learning in the brain, or simply BL, for short), in particular: (1) it has been unclear to date whether BP can be implemented exactly via BL; (2) there is a lack of local plasticity in BP, i.e., weight updates require information that is not locally available, while BL utilizes only locally available information; and (3) there is a lack of autonomy in BP, i.e., some external control over the neural network is required (e.g., switching between prediction and learning stages requires changes to dynamics and synaptic plasticity rules), while BL works fully autonomously. Bridging such gaps, i.e., understanding how BP can be approximated by BL, has been of major interest in both neuroscience and machine learning. Despite tremendous efforts, however, no previous model has bridged the gaps to the degree of demonstrating an equivalence to BP; instead, only approximations to BP have been shown. Here, we present for the first time a framework within BL that bridges the above crucial gaps. We propose a BL model that (1) produces exactly the same updates of the neural weights as BP, while (2) employing local plasticity, i.e., all neurons perform only local computations, done simultaneously. We then modify it to an alternative BL model that (3) also works fully autonomously. Overall, our work provides important evidence for the debate on the long-disputed question whether the brain can perform BP.
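The local-plasticity point in the abstract can be illustrated with a small predictive-coding-style sketch: an inference phase first relaxes the hidden state to reduce prediction errors, after which each weight update is simply the product of locally available pre-synaptic activity and post-synaptic prediction error. This is a minimal sketch of that general idea only, not the authors' exact model; the network sizes, learning rates, iteration count, and variable names are all illustrative assumptions.

```python
# Minimal predictive-coding-style sketch (illustrative; not the paper's exact model).
import numpy as np

rng = np.random.default_rng(0)

def f(x):           # activation function (assumption: tanh)
    return np.tanh(x)

def f_prime(x):     # its derivative
    return 1.0 - np.tanh(x) ** 2

# Two-weight-layer network: input (3) -> hidden (4) -> output (2).
W1 = rng.normal(scale=0.1, size=(4, 3))
W2 = rng.normal(scale=0.1, size=(2, 4))

x_in  = rng.normal(size=(3, 1))   # input sample (clamped)
y_tgt = rng.normal(size=(2, 1))   # target (clamped at the output layer)

lr_x, lr_w = 0.1, 0.01            # state and weight learning rates (assumptions)

# --- Inference phase: relax the hidden state to minimize prediction errors. ---
x1 = W1 @ f(x_in)                 # initialize hidden state at its prediction
x2 = y_tgt                        # output state clamped to the target

for _ in range(100):
    eps1 = x1 - W1 @ f(x_in)      # prediction error local to layer 1
    eps2 = x2 - W2 @ f(x1)        # prediction error local to layer 2
    # State update uses only the two adjacent error signals (local information).
    x1 += lr_x * (-eps1 + (W2.T @ eps2) * f_prime(x1))

# --- Learning phase: each weight update is a product of locally available ---
# --- pre-synaptic activity and post-synaptic prediction error.            ---
eps1 = x1 - W1 @ f(x_in)
eps2 = x2 - W2 @ f(x1)
W1 += lr_w * eps1 @ f(x_in).T
W2 += lr_w * eps2 @ f(x1).T
```

Under this scheme every synapse sees only the activity of the neuron it connects from and the error of the neuron it connects to; the paper's contribution is showing how such a local scheme can be made to reproduce BP's weight updates exactly rather than approximately.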