Meta-learning families of plasticity rules in recurrent spiking networks using simulation-based inference

B Confavreux, P Ramesh… - Advances in …, 2023 - proceedings.neurips.cc
There is substantial experimental evidence that learning and memory-related behaviours
rely on local synaptic changes, but the search for distinct plasticity rules has been driven by …

Meta-learning synaptic plasticity and memory addressing for continual familiarity detection

D Tyulmankov, GR Yang, LF Abbott - Neuron, 2022 - cell.com
Over the course of a lifetime, we process a continual stream of information. Extracted from
this stream, memories must be efficiently encoded and stored in an addressable manner for …

Forms of explanation and understanding for neuroscience and artificial intelligence

JAF Thompson - Journal of Neurophysiology, 2021 - journals.physiology.org
Much of the controversy evoked by the use of deep neural networks as models of biological
neural systems amounts to debates over what constitutes scientific progress in neuroscience …

Meta-learning biologically plausible plasticity rules with random feedback pathways

N Shervani-Tabar, R Rosenbaum - Nature Communications, 2023 - nature.com
Backpropagation is widely used to train artificial neural networks, but its relationship to
synaptic plasticity in the brain is unknown. Some biological models of backpropagation rely …

Credit assignment through broadcasting a global error vector

D Clark, LF Abbott, SY Chung - Advances in Neural …, 2021 - proceedings.neurips.cc
Backpropagation (BP) uses detailed, unit-specific feedback to train deep neural networks
(DNNs) with remarkable success. That biological neural circuits appear to perform credit …

A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network

B Confavreux, F Zenke, E Agnes… - Advances in Neural …, 2020 - proceedings.neurips.cc
The search for biologically faithful synaptic plasticity rules has resulted in a large body of
models. They are usually inspired by--and fitted to--experimental data, but they rarely …

Parallel training of deep networks with local updates

M Laskin, L Metz, S Nabarro, M Saroufim… - arXiv preprint arXiv …, 2020 - arxiv.org
Deep learning models trained on large data sets have been widely successful in both vision
and language domains. As state-of-the-art deep learning architectures have continued to …

Learn2hop: Learned optimization on rough landscapes

A Merchant, L Metz, SS Schoenholz… - … on Machine Learning, 2021 - proceedings.mlr.press
Optimization of non-convex loss surfaces containing many local minima remains a critical
problem in a variety of domains, including operations research, informatics, and material …

Fast on-device adaptation for spiking neural networks via online-within-online meta-learning

B Rosenfeld, B Rajendran… - 2021 IEEE Data Science …, 2021 - ieeexplore.ieee.org
Spiking Neural Networks (SNNs) have recently gained popularity as machine learning
models for on-device edge intelligence for applications such as mobile healthcare …

Passive exposure to task-relevant stimuli enhances categorization learning

C Schmid, M Haziq, MM Baese-Berk, JM Murray… - eLife, 2024 - elifesciences.org
Learning to perform a perceptual decision task is generally achieved through sessions of
effortful practice with feedback. Here, we investigated how passive exposure to task-relevant …