Authors
Jeffrey Beck, Vikranth R Bejjanki, Alexandre Pouget
Publication date
2011/6/1
Journal
Neural computation
Volume
23
Issue
6
Pages
1484-1502
Publisher
MIT Press
Description
A simple expression for a lower bound of Fisher information is derived for a network of recurrently connected spiking neurons that have been driven to a noise-perturbed steady state. We call this lower bound linear Fisher information, as it corresponds to the Fisher information that can be recovered by a locally optimal linear estimator. Unlike recent similar calculations, the approach used here includes the effects of nonlinear gain functions and correlated input noise and yields a surprisingly simple and intuitive expression that offers substantial insight into the sources of information degradation across successive layers of a neural network. Here, this expression is used to (1) compute the optimal (i.e., information-maximizing) firing rate of a neuron, (2) demonstrate why sharpening tuning curves by either thresholding or the action of recurrent connectivity is generally a bad idea, (3) show how a single cortical …
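The "linear Fisher information" named in the abstract is, in its standard usage, the Fisher information recoverable by a locally optimal linear estimator: I_lin(s) = f'(s)^T Σ(s)^{-1} f'(s), where f'(s) is the derivative of the population tuning curves and Σ(s) the noise covariance. As a minimal sketch of that quantity (the tuning-curve and noise parameters below are hypothetical, not taken from the paper):

```python
import numpy as np

def linear_fisher_information(f_prime, sigma):
    """Linear Fisher information I_lin = f'(s)^T Sigma^{-1} f'(s).

    f_prime : (N,) derivative of the tuning curves at stimulus s
    sigma   : (N, N) noise covariance of the responses at s
    """
    # solve Sigma x = f' rather than inverting Sigma explicitly
    return float(f_prime @ np.linalg.solve(sigma, f_prime))

# Toy population: 50 neurons with Gaussian tuning curves and
# Poisson-like variance plus a weak correlated-noise component
# (all parameters here are illustrative assumptions).
n, s = 50, 0.0
centers = np.linspace(-2.0, 2.0, n)
width, gain = 0.5, 10.0
f = gain * np.exp(-((s - centers) ** 2) / (2 * width ** 2))   # mean rates
f_prime = f * (centers - s) / width ** 2                      # d f / d s
sigma = np.diag(f) + 0.05 * np.outer(f, f) / gain             # noise covariance

print(linear_fisher_information(f_prime, sigma))
```

With independent Poisson noise (Σ = diag(f)) this reduces to the familiar Σ_i f'_i² / f_i; the correlated term in Σ lowers the information, which is the kind of degradation across layers the paper's expression is built to track.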
Total citations
[Per-year citation bar chart, 2011–2024; individual yearly counts not recoverable from this extraction.]