A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Mechanisms of systems memory consolidation during sleep

JG Klinzing, N Niethard, J Born - Nature neuroscience, 2019 - nature.com
Long-term memory formation is a major function of sleep. Based on evidence from
neurophysiological and behavioral studies mainly in humans and rodents, we consider the …

Dualprompt: Complementary prompting for rehearsal-free continual learning

Z Wang, Z Zhang, S Ebrahimi, R Sun, H Zhang… - … on Computer Vision, 2022 - Springer
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …

Data distributional properties drive emergent in-context learning in transformers

S Chan, A Santoro, A Lampinen… - Advances in …, 2022 - proceedings.neurips.cc
Large transformer-based models are able to perform in-context few-shot learning, without
being explicitly trained for it. This observation raises the question: what aspects of the …

Learning to prompt for continual learning

Z Wang, Z Zhang, CY Lee, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …

2022 roadmap on neuromorphic computing and engineering

DV Christensen, R Dittmann… - Neuromorphic …, 2022 - iopscience.iop.org
Modern computation based on von Neumann architecture is now a mature cutting-edge
science. In the von Neumann architecture, processing and memory units are implemented …

Brain-inspired replay for continual learning with artificial neural networks

GM Van de Ven, HT Siegelmann, AS Tolias - Nature communications, 2020 - nature.com
Artificial neural networks suffer from catastrophic forgetting. Unlike humans, when these
networks are trained on something new, they rapidly forget what was learned before. In the …

Adversarial reciprocal points learning for open set recognition

G Chen, P Peng, X Wang, Y Tian - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Open set recognition (OSR), aiming to simultaneously classify the seen classes and identify
the unseen classes as 'unknown', is essential for reliable machine learning. The key …

Dualnet: Continual learning, fast and slow

Q Pham, C Liu, S Hoi - Advances in Neural Information …, 2021 - proceedings.neurips.cc
According to Complementary Learning Systems (CLS) theory (McClelland et al., 1995) in
neuroscience, humans do effective continual learning …

Bridging biological and artificial neural networks with emerging neuromorphic devices: fundamentals, progress, and challenges

J Tang, F Yuan, X Shen, Z Wang, M Rao… - Advanced …, 2019 - Wiley Online Library
As the research on artificial intelligence booms, there is broad interest in brain‐inspired
computing using novel neuromorphic devices. The potential of various emerging materials …