STDP-compatible approximation of backpropagation in an energy-based model

Y Bengio, T Mesnard, A Fischer, S Zhang… - Neural …, 2017 - ieeexplore.ieee.org
We show that Langevin Markov chain Monte Carlo inference in an energy-based model with
latent variables has the property that the early steps of inference, starting from a stationary …

Early inference in energy-based models approximates back-propagation

Y Bengio, A Fischer - arXiv preprint arXiv:1510.02777, 2015 - arxiv.org
We show that Langevin MCMC inference in an energy-based model with latent variables
has the property that the early steps of inference, starting from a stationary point, correspond …

Equilibrium propagation: Bridging the gap between energy-based models and backpropagation

B Scellier, Y Bengio - Frontiers in computational neuroscience, 2017 - frontiersin.org
We introduce Equilibrium Propagation, a learning framework for energy-based models. It
involves only one kind of neural computation, performed in both the first phase (when the …

Learning stable, regularised latent models of neural population dynamics

L Buesing, JH Macke, M Sahani - Network: Computation in Neural …, 2012 - Taylor & Francis
Ongoing advances in experimental technique are making commonplace simultaneous
recordings of the activity of tens to hundreds of cortical neurons at high temporal resolution …

Backpropagation at the infinitesimal inference limit of energy-based models: Unifying predictive coding, equilibrium propagation, and contrastive Hebbian learning

B Millidge, Y Song, T Salvatori, T Lukasiewicz… - arXiv preprint arXiv …, 2022 - arxiv.org
How the brain performs credit assignment is a fundamental unsolved problem in
neuroscience. Many 'biologically plausible' algorithms have been proposed, which compute …

Learning efficient backprojections across cortical hierarchies in real time

K Max, L Kriener, G Pineda García, T Nowotny… - Nature Machine …, 2024 - nature.com
Models of sensory processing and learning in the cortex need to efficiently assign
credit to synapses in all areas. In deep learning, a known solution is error backpropagation …

Spike-timing-dependent plasticity: the relationship to rate-based learning for models with weight dynamics determined by a stable fixed point

AN Burkitt, H Meffin, DB Grayden - Neural Computation, 2004 - direct.mit.edu
Experimental evidence indicates that synaptic modification depends on the timing
relationship between the presynaptic inputs and the output spikes that they generate. In this …

Dendritic cortical microcircuits approximate the backpropagation algorithm

J Sacramento, R Ponte Costa… - Advances in neural …, 2018 - proceedings.neurips.cc
Deep learning has seen remarkable developments over the last years, many of them
inspired by neuroscience. However, the main learning mechanism behind these advances …

Theories of error back-propagation in the brain

JCR Whittington, R Bogacz - Trends in cognitive sciences, 2019 - cell.com
This review article summarises recently proposed theories on how neural circuits in the
brain could approximate the error back-propagation algorithm used by artificial neural …

A framework for studying synaptic plasticity with neural spike train data

S Linderman, CH Stock… - Advances in neural …, 2014 - proceedings.neurips.cc
Learning and memory in the brain are implemented by complex, time-varying changes in
neural circuitry. The computational rules according to which synaptic weights change over …