Backpropagation (BP) uses detailed, unit-specific feedback to train deep neural networks (DNNs) with remarkable success. That biological neural circuits appear to perform credit …
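A worked equation makes the "detailed, unit-specific feedback" concrete: in standard backpropagation every hidden unit receives its own error signal $\delta$, computed by the layerwise recursion below (standard textbook notation, not drawn from the snippet itself):

$$
\delta^{(l)} = \bigl(W^{(l+1)}\bigr)^{\top} \delta^{(l+1)} \odot f'\bigl(z^{(l)}\bigr),
\qquad
\frac{\partial \mathcal{L}}{\partial W^{(l)}} = \delta^{(l)} \bigl(a^{(l-1)}\bigr)^{\top}.
$$

Computing $\delta^{(l)}$ requires transporting the exact forward weights $W^{(l+1)}$ back to layer $l$, which is the step usually singled out as biologically implausible.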
We study the loss surface of DNNs with $L_2$ regularization. We show that the loss in terms of the parameters can be reformulated into a loss in terms of the layerwise activations …
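One common shape such a layerwise reformulation takes (a sketch of the generic "lifted" objective, not necessarily the exact formulation used in this paper) treats the activations $a_l$ as free variables coupled to the weights by penalty terms, with assumed coefficients $\rho$ and $\lambda$:

$$
\min_{\{W_l\},\,\{a_l\}} \;\; \mathcal{L}\bigl(a_L, y\bigr)
\;+\; \frac{\rho}{2}\sum_{l=1}^{L} \bigl\| a_l - f\bigl(W_l\, a_{l-1}\bigr) \bigr\|_2^2
\;+\; \frac{\lambda}{2}\sum_{l=1}^{L} \bigl\| W_l \bigr\|_F^2,
\qquad a_0 = x .
$$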
An established normative approach for understanding the algorithmic basis of neural computation is to derive online algorithms from principled computational objectives and …
C Scholl, ME Rule, MH Hennig - PLoS computational biology, 2021 - journals.plos.org
During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as …
The brain effortlessly solves blind source separation (BSS) problems, but the algorithm it uses remains elusive. In signal processing, linear BSS problems are often solved by …
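As a concrete illustration of one standard signal-processing approach to linear BSS (independent component analysis; this particular choice is our assumption, not necessarily the method the snippet goes on to describe), a minimal sketch using scikit-learn:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy linear BSS setup: two independent sources mixed by an unknown matrix A.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]   # (n_samples, n_sources)
A = np.array([[1.0, 0.5], [0.4, 1.2]])                   # unknown mixing matrix
observations = sources @ A.T + 0.02 * rng.standard_normal(sources.shape)

# FastICA recovers the sources up to permutation and scaling.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observations)
print(recovered.shape)  # (2000, 2)
```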
C Shi, L Pan, I Dokmanić - arXiv preprint arXiv:2407.19353, 2024 - arxiv.org
Feature-learning deep nets progressively collapse data to a regular low-dimensional geometry. How this phenomenon emerges from collective action of nonlinearity, noise …
This paper investigates how to solve image classification with Hopfield neural networks (HNNs) and oscillatory neural networks (ONNs). This is a first attempt to apply ONNs for …
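For readers unfamiliar with HNNs, a minimal classical Hopfield network (Hebbian outer-product storage plus asynchronous sign updates) looks roughly like the following; this is a generic sketch, not the specific HNN/ONN architecture the paper proposes.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule; patterns are +/-1 vectors of equal length."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)            # no self-connections
    return W / patterns.shape[0]

def recall(W, state, n_steps=50):
    """Asynchronous sign updates that settle into a stored attractor."""
    state = state.copy()
    rng = np.random.default_rng(0)
    for _ in range(n_steps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store one pattern and recover it from a corrupted cue.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy(); noisy[:2] *= -1
print(recall(W, noisy))               # matches `pattern`
```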
AG Ororbia - arXiv preprint arXiv:2312.09257, 2023 - arxiv.org
In this survey, we examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology, unifying these various processes …
Recent work on sample-efficient training of Deep Neural Networks (DNNs) proposed a semi-supervised methodology based on biologically inspired Hebbian learning, combined with …
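As a rough illustration of the kind of Hebbian update such approaches build on (Oja's rule here, chosen as a representative stabilized Hebbian rule; the paper's exact learning rule may differ), a minimal sketch:

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One Oja step: Hebbian growth (y * x) plus a decay term that keeps ||w|| bounded."""
    y = w @ x
    return w + lr * y * (x - y * w)

# Drive a single linear unit with correlated inputs; w converges toward the
# leading principal component of the input distribution.
rng = np.random.default_rng(0)
C = np.array([[3.0, 1.0], [1.0, 1.0]])   # input covariance
L = np.linalg.cholesky(C)
w = rng.standard_normal(2)
for _ in range(5000):
    x = L @ rng.standard_normal(2)
    w = oja_update(w, x)
print(w / np.linalg.norm(w))             # approx. leading eigenvector of C
```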