Long-term memory formation is a major function of sleep. Based on evidence from neurophysiological and behavioral studies mainly in humans and rodents, we consider the …
Continual learning aims to enable a single model to learn a sequence of tasks without catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …
Large transformer-based models are able to perform in-context few-shot learning, without being explicitly trained for it. This observation raises the question: what aspects of the …
The mainstream paradigm behind continual learning has been to adapt the model parameters to non-stationary data distributions, where catastrophic forgetting is the central …
Modern computing based on the von Neumann architecture is now a mature, cutting-edge technology. In the von Neumann architecture, processing and memory units are implemented …
Artificial neural networks suffer from catastrophic forgetting. Unlike humans, when these networks are trained on something new, they rapidly forget what was learned before. In the …
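The forgetting phenomenon described above can be reproduced in a few lines: the sketch below (a hypothetical minimal setup, not taken from any of the cited papers) trains a single logistic-regression model on task A, then continues training it only on task B with a different decision boundary, and measures how accuracy on task A collapses.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(shift):
    # Two-dimensional points; the label boundary depends on `shift`
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + shift * X[:, 1] > 0).astype(float)
    return X, y

def train(w, X, y, steps=500, lr=0.5):
    # Plain logistic regression trained by full-batch gradient descent
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean(((X @ w) > 0) == y))

Xa, ya = make_task(+1.0)   # task A: boundary x0 + x1 = 0
Xb, yb = make_task(-1.0)   # task B: boundary x0 - x1 = 0

w = train(np.zeros(2), Xa, ya)
acc_a_before = accuracy(w, Xa, ya)   # high after training on A
w = train(w, Xb, yb)                 # continue training on B only
acc_a_after = accuracy(w, Xa, ya)    # drops sharply: task A is forgotten
```

Because the two tasks' boundaries are orthogonal, sequential training overwrites the weights that solved task A, which is exactly the catastrophic forgetting that rehearsal buffers and CLS-inspired methods aim to prevent.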
Open set recognition (OSR), aiming to simultaneously classify the seen classes and identify the unseen classes as 'unknown', is essential for reliable machine learning. The key …
Q Pham, C Liu, S Hoi - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Abstract According to Complementary Learning Systems (CLS) theory~\cite{mcclelland1995there} in neuroscience, humans do effective \emph{continual learning} …
As the research on artificial intelligence booms, there is broad interest in brain‐inspired computing using novel neuromorphic devices. The potential of various emerging materials …