Neural network models are potential tools for improving our understanding of complex brain functions. To address this goal, these models need to be neurobiologically realistic …
Recurrently connected networks of spiking neurons underlie the astounding information processing capabilities of the brain. Yet in spite of extensive research, how they can learn …
Synaptic plasticity is believed to be a key physiological mechanism for learning. It is well established that it depends on pre- and postsynaptic activity. However, models that rely …
We introduce the HSIC (Hilbert-Schmidt independence criterion) bottleneck for training deep neural networks. The HSIC bottleneck is an alternative to the conventional cross-entropy …
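The snippet above names the HSIC bottleneck only at the level of motivation. For orientation, here is a minimal NumPy sketch of the empirical (biased) HSIC estimator that such an objective builds on; the Gaussian kernel, the bandwidth sigma, the function names, and the toy data are illustrative assumptions, not the cited paper's actual training setup.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise squared Euclidean distances, then Gaussian (RBF) kernel.
    sq = np.sum(X**2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimate between samples X (n x d_x) and Y (n x d_y)."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy usage: hidden-layer activations Z, inputs X, labels Y (one-hot).
# An HSIC-bottleneck-style objective would reward dependence between Z and Y
# while penalizing dependence between Z and X (an assumption about the
# general idea, not a reproduction of the paper's loss).
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))
Z = np.tanh(X @ rng.normal(size=(10, 5)))
Y = np.eye(3)[rng.integers(0, 3, size=64)]
print(hsic(Z, Y), hsic(Z, X))
```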
From the conception of Baddeley's visuospatial sketchpad, visual working memory and visual attention have been closely linked concepts. An attractive model has advocated unity …
Training deep neural networks with the error backpropagation algorithm is considered implausible from a biological perspective. Numerous recent publications suggest elaborate …
For both humans and machines, the essence of learning is to pinpoint which components in their information processing pipeline are responsible for an error in the output, a challenge that …
Convolutional Neural Networks (CNNs) are a class of machine learning models predominantly used in computer vision tasks and can achieve human-like performance …
Recent advances in artificial intelligence (AI) and neuroscience are impressive. In AI, these include the development of computer programs that can beat a grandmaster at Go or …