Authors
Robert Legenstein, Christian Naeger, Wolfgang Maass
Publication date
2005/11/1
Journal
Neural computation
Volume
17
Issue
11
Pages
2337-2382
Publisher
MIT Press
Description
Spiking neurons are very flexible computational modules, which can implement with different values of their adjustable synaptic parameters an enormous variety of different transformations F from input spike trains to output spike trains. We examine in this letter the question to what extent a spiking neuron with biologically realistic models for dynamic synapses can be taught via spike-timing-dependent plasticity (STDP) to implement a given transformation F. We consider a supervised learning paradigm where during training, the output of the neuron is clamped to the target signal (teacher forcing). The well-known perceptron convergence theorem asserts the convergence of a simple supervised learning algorithm for drastically simplified neuron models (McCulloch-Pitts neurons). We show that in contrast to the perceptron convergence theorem, no theoretical guarantee can be given for the convergence of STDP …
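The abstract refers to spike-timing-dependent plasticity (STDP), in which a synaptic weight is potentiated when a presynaptic spike precedes a postsynaptic spike and depressed when the order is reversed. As context, here is a minimal sketch of the standard pair-based exponential STDP window; the parameter values (`a_plus`, `a_minus`, `tau_plus`, `tau_minus`) are illustrative placeholders, not the settings used in the paper.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_max=1.0):
    """Pair-based STDP rule (illustrative parameters, times in ms).

    Potentiates the weight when the presynaptic spike precedes the
    postsynaptic spike (t_post > t_pre), depresses it otherwise,
    with exponentially decaying influence of the spike-time gap.
    Weights are clipped to [0, w_max].
    """
    dt = t_post - t_pre
    if dt > 0:
        dw = a_plus * math.exp(-dt / tau_plus)   # pre before post: LTP
    else:
        dw = -a_minus * math.exp(dt / tau_minus)  # post before pre: LTD
    return min(max(w + dw, 0.0), w_max)

# Example: a causal pairing strengthens, an anti-causal pairing weakens.
w_ltp = stdp_update(0.5, t_pre=10.0, t_post=15.0)
w_ltd = stdp_update(0.5, t_pre=15.0, t_post=10.0)
```

Under teacher forcing, as described in the abstract, `t_post` would be dictated by the clamped target spike train rather than by the neuron's own output.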