On the importance of severely testing deep learning models of cognition

JS Bowers, G Malhotra, F Adolfi, M Dujmović… - Cognitive Systems …, 2023 - Elsevier
Researchers studying the correspondences between Deep Neural Networks (DNNs) and
humans often give little consideration to severe testing when drawing conclusions from …

Reconciling shared versus context-specific information in a neural network model of latent causes

Q Lu, TT Nguyen, Q Zhang, U Hasson, TL Griffiths… - Scientific reports, 2024 - nature.com
It has been proposed that, when processing a stream of events, humans divide their
experiences in terms of inferred latent causes (LCs) to support context-dependent learning …

Dual memory model for experience-once task-incremental lifelong learning

G Ma, R Jiang, L Wang, H Tang - Neural Networks, 2023 - Elsevier
Experience replay (ER) is a widely-adopted neuroscience-inspired method to perform
lifelong learning. Nonetheless, existing ER-based approaches consider very coarse memory …

Sleep microstructure organizes memory replay

H Chang, W Tang, AM Wulf, T Nyasulu, ME Wolf… - Nature, 2025 - nature.com
Recently acquired memories are reactivated in the hippocampus during sleep, an initial step
for their consolidation. This process is concomitant with the hippocampal reactivation of …

Cognitive Overload Attack: Prompt Injection for Long Context

B Upadhayay, V Behzadan, A Karbasi - arXiv preprint arXiv:2410.11272, 2024 - arxiv.org
Large Language Models (LLMs) have demonstrated remarkable capabilities in performing
tasks across various domains without needing explicit retraining. This capability, known as …

A general paradigm of knowledge-driven and data-driven fusion

F Hu, W Zhong, L Ye, D Duan… - 2023 15th International …, 2023 - ieeexplore.ieee.org
Knowledge and data fusion is a vital research hotspot in current artificial intelligence. The
fusion of data-driven and knowledge-driven would be able to organically combine implicit …

Bridging Neuroscience and AI: Environmental Enrichment as a Model for Forward Knowledge Transfer

R Saxena, BL McNaughton - ArXiv, 2024 - pmc.ncbi.nlm.nih.gov
Continual learning (CL) refers to an agent's capability to learn from a continuous stream of
data and transfer knowledge without forgetting old information. One crucial aspect of CL is …

A cardiologist-like computer-aided interpretation framework to improve arrhythmia diagnosis from imbalanced training datasets

L Hu, S Huang, H Liu, Y Du, J Zhao, X Peng, D Li… - Patterns, 2023 - cell.com
Arrhythmias can pose a significant threat to cardiac health, potentially leading to serious
consequences such as stroke, heart failure, cardiac arrest, shock, and sudden death. In …

Balanced Gradient Sample Retrieval for Enhanced Knowledge Retention in Proxy-based Continual Learning

H Xu, J Wasilewski, B Krawczyk - arXiv preprint arXiv:2412.14430, 2024 - arxiv.org
Continual learning in deep neural networks often suffers from catastrophic forgetting, where
representations for previous tasks are overwritten during subsequent training. We propose a …

Environmental enrichment: a biological model of forward transfer in continual learning

R Saxena, BL McNaughton - arXiv preprint arXiv:2405.07295, 2024 - arxiv.org
Continual learning (CL) refers to an agent's capability to learn from a continuous stream of
data and transfer knowledge without forgetting old information. One crucial aspect of CL is …