Meta-learning biologically plausible semi-supervised update rules

K Gu, S Greydanus, L Metz, N Maheswaranathan… - bioRxiv, 2019 - biorxiv.org
Abstract
The question of how neurons embedded in a network update their synaptic weights to collectively achieve behavioral goals is a longstanding problem in systems neuroscience. Since Hebb's hypothesis that cells that fire together strengthen their connections, cellular studies have shed light on potential synaptic mechanisms underlying learning. These mechanisms have directly driven the careful hand design of biologically plausible models of learning and memory in computational neuroscience. However, these hand-designed rules have yet to achieve satisfying success training large neural networks, and are dramatically outperformed by biologically implausible approaches such as backprop. We propose an alternative paradigm for designing biologically plausible learning rules: using meta-learning to learn a parametric synaptic update rule which is capable of training deep networks. We demonstrate this approach by meta-learning an update rule for semi-supervised tasks, where sparse labels are provided to a deep network but the majority of inputs are unlabeled. The meta-learned plasticity rule integrates bottom-up, top-down, and recurrent inputs to each neuron, and generates weight updates as the product of pre- and post-synaptic neuronal outputs. The way in which the inputs to each neuron are combined to produce a learning signal, however, is itself a meta-learned function, parameterized by a neural network. Critically, the meta-learned update rule integrates only neuron-local information when proposing updates; that is, our learning rule is spatially localized to individual neurons. After meta-learning, the resulting synaptic update rule is capable of driving task-relevant learning for semi-supervised tasks. We demonstrate this capability on two simple classification problems.
In general, we believe meta-learning to be a powerful approach to finding more effective synaptic plasticity rules, which will motivate new hypotheses for biological neural networks, and new algorithms for artificial neural networks.
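The structure of the plasticity rule described in the abstract can be sketched in code: weight updates are formed as the product of pre-synaptic activity and a per-neuron learning signal, where the function that combines bottom-up, top-down, and recurrent inputs into that signal is itself a small neural network. The sketch below is a minimal illustration, not the paper's implementation; the combiner weights are random placeholders standing in for parameters that the outer meta-optimization would learn, and all names (`MetaLearnedCombiner`, `local_update`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class MetaLearnedCombiner:
    """Hypothetical meta-learned function: a tiny MLP that maps each
    neuron's (bottom-up, top-down, recurrent) inputs to a scalar
    learning signal. Its weights would be set by meta-learning; here
    they are random placeholders."""

    def __init__(self, in_dim=3, hidden=8):
        self.W1 = rng.normal(0, 0.1, (hidden, in_dim))
        self.W2 = rng.normal(0, 0.1, (1, hidden))

    def __call__(self, bottom_up, top_down, recurrent):
        # Stack the three neuron-local inputs as features: (n_neurons, 3)
        x = np.stack([bottom_up, top_down, recurrent], axis=-1)
        h = np.tanh(x @ self.W1.T)                 # (n_neurons, hidden)
        return (h @ self.W2.T).squeeze(-1)         # one signal per neuron

def local_update(W, pre, bottom_up, top_down, recurrent, combiner, lr=0.01):
    """Weight update as the product of pre-synaptic activity and the
    post-synaptic learning signal. Only neuron-local quantities are used."""
    signal = combiner(bottom_up, top_down, recurrent)  # (n_post,)
    dW = np.outer(signal, pre)                         # (n_post, n_pre)
    return W + lr * dW

# Toy usage: a layer with 4 pre-synaptic and 3 post-synaptic neurons.
W = rng.normal(0, 0.1, (3, 4))
pre = rng.normal(size=4)
combiner = MetaLearnedCombiner()
W_new = local_update(W, pre,
                     bottom_up=rng.normal(size=3),
                     top_down=rng.normal(size=3),
                     recurrent=rng.normal(size=3),
                     combiner=combiner)
```

Because the update depends only on each neuron's own inputs and outputs, the rule is spatially local in the sense the abstract describes; the non-local component is confined to the offline meta-optimization that sets the combiner's parameters.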