Kernelized information bottleneck leads to biologically plausible 3-factor Hebbian learning in deep networks

R Pogodin, P Latham - Advances in Neural Information …, 2020 - proceedings.neurips.cc
The state-of-the-art machine learning approach to training deep neural networks,
backpropagation, is implausible for real neural networks: neurons need to know their …

Credit assignment through broadcasting a global error vector

D Clark, LF Abbott, SY Chung - Advances in Neural …, 2021 - proceedings.neurips.cc
Backpropagation (BP) uses detailed, unit-specific feedback to train deep neural networks
(DNNs) with remarkable success. That biological neural circuits appear to perform credit …

Feature Learning in $L_2$-regularized DNNs: Attraction/Repulsion and Sparsity

A Jacot, E Golikov, C Hongler… - Advances in Neural …, 2022 - proceedings.neurips.cc
We study the loss surface of DNNs with $L_2$ regularization. We show that the loss in
terms of the parameters can be reformulated into a loss in terms of the layerwise activations …

Normative framework for deriving neural networks with multicompartmental neurons and non-Hebbian plasticity

D Lipshutz, Y Bahroun, S Golkar, AM Sengupta… - PRX Life, 2023 - APS
An established normative approach for understanding the algorithmic basis of neural
computation is to derive online algorithms from principled computational objectives and …

The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules

C Scholl, ME Rule, MH Hennig - PLoS computational biology, 2021 - journals.plos.org
During development, biological neural networks produce more synapses and neurons than
needed. Many of these synapses and neurons are later removed in a process known as …

A normative and biologically plausible algorithm for independent component analysis

Y Bahroun, D Chklovskii… - Advances in Neural …, 2021 - proceedings.neurips.cc
The brain effortlessly solves blind source separation (BSS) problems, but the algorithm it
uses remains elusive. In signal processing, linear BSS problems are often solved by …

A spring-block theory of feature learning in deep neural networks

C Shi, L Pan, I Dokmanić - arXiv preprint arXiv:2407.19353, 2024 - arxiv.org
Feature-learning deep nets progressively collapse data to a regular low-dimensional
geometry. How this phenomenon emerges from collective action of nonlinearity, noise …

Training energy-based single-layer Hopfield and oscillatory networks with unsupervised and supervised algorithms for image classification

M Abernot, A Todri-Sanial - Neural Computing and Applications, 2023 - Springer
This paper investigates how to solve image classification with Hopfield neural networks
(HNNs) and oscillatory neural networks (ONNs). This is a first attempt to apply ONNs for …

Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment

AG Ororbia - arXiv preprint arXiv:2312.09257, 2023 - arxiv.org
In this survey, we examine algorithms for conducting credit assignment in artificial neural
networks that are inspired or motivated by neurobiology, unifying these various processes …

Scalable bio-inspired training of deep neural networks with FastHebb

G Lagani, F Falchi, C Gennaro, H Fassold, G Amato - Neurocomputing, 2024 - Elsevier
Recent work on sample efficient training of Deep Neural Networks (DNNs) proposed a semi-
supervised methodology based on biologically inspired Hebbian learning, combined with …