Continual lifelong learning with neural networks: A review

GI Parisi, R Kemker, JL Part, C Kanan, S Wermter - Neural networks, 2019 - Elsevier
Humans and animals have the ability to continually acquire, fine-tune, and transfer
knowledge and skills throughout their lifespan. This ability, referred to as lifelong learning, is …

Cellular mechanisms of conscious processing

J Aru, M Suzuki, ME Larkum - Trends in cognitive sciences, 2020 - cell.com
Recent breakthroughs in neurobiology indicate that the time is ripe to understand how
cellular-level mechanisms are related to conscious experience. Here, we highlight the …

FOSTER: Feature boosting and compression for class-incremental learning

FY Wang, DW Zhou, HJ Ye, DC Zhan - European conference on computer …, 2022 - Springer
The ability to learn new concepts continually is necessary in this ever-changing world.
However, deep neural networks suffer from catastrophic forgetting when learning new …

DER: Dynamically expandable representation for class incremental learning

S Yan, J Xie, X He - … of the IEEE/CVF conference on …, 2021 - openaccess.thecvf.com
We address the problem of class incremental learning, which is a core step towards
achieving adaptive vision intelligence. In particular, we consider the task setting of …

Toward an integration of deep learning and neuroscience

AH Marblestone, G Wayne, KP Kording - Frontiers in computational …, 2016 - frontiersin.org
Neuroscience has focused on the detailed implementation of computation, studying neural
codes, dynamics and circuits. In machine learning, however, artificial neural networks tend …

Continuous learning in single-incremental-task scenarios

D Maltoni, V Lomonaco - Neural Networks, 2019 - Elsevier
It was recently shown that architectural, regularization and rehearsal strategies can be used
to train deep models sequentially on a number of disjoint tasks without forgetting previously …

Continual learning with lifelong vision transformer

Z Wang, L Liu, Y Duan, Y Kong… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Continual learning methods aim at training a neural network from sequential data with
streaming labels, relieving catastrophic forgetting. However, existing methods are based on …

Cybernetic big five theory

CG DeYoung - Journal of research in personality, 2015 - Elsevier
Cybernetics, the study of goal-directed, adaptive systems, is the best framework for an
integrative theory of personality. Cybernetic Big Five Theory attempts to provide a …

The now-or-never bottleneck: A fundamental constraint on language

MH Christiansen, N Chater - Behavioral and brain sciences, 2016 - cambridge.org
Memory is fleeting. New material rapidly obliterates previous material. How, then, can the
brain deal successfully with the continual deluge of linguistic input? We argue that, to deal …

Survey of state-of-the-art mixed data clustering algorithms

A Ahmad, SS Khan - IEEE Access, 2019 - ieeexplore.ieee.org
Mixed data comprises both numeric and categorical features, and mixed datasets occur
frequently in many domains, such as health, finance, and marketing. Clustering is often …