A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Recent advances of continual learning in computer vision: An overview

H Qu, H Rahmani, L Xu, B Williams, J Liu - arXiv preprint arXiv …, 2021 - arxiv.org
In contrast to batch learning, where all training data is available at once, continual learning
represents a family of methods that accumulate knowledge and learn continuously with data …

Patching open-vocabulary models by interpolating weights

G Ilharco, M Wortsman, SY Gadre… - Advances in …, 2022 - proceedings.neurips.cc
Open-vocabulary models like CLIP achieve high accuracy across many image classification
tasks. However, there are still settings where their zero-shot performance is far from optimal …
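
As intuition for the patching mechanism, here is a minimal sketch of weight interpolation in PyTorch: a patched model mixes the zero-shot and fine-tuned weights with a single coefficient. Names (`interpolate_weights`, `alpha`) are illustrative, not the paper's code.

```python
import torch

def interpolate_weights(theta_zeroshot, theta_finetuned, alpha=0.5):
    """Linearly interpolate two state dicts with matching keys and shapes.

    alpha=0 recovers the zero-shot weights, alpha=1 the fine-tuned ones;
    intermediate values trade patched-task accuracy against retained
    zero-shot behavior.
    """
    assert theta_zeroshot.keys() == theta_finetuned.keys()
    return {
        k: (1 - alpha) * theta_zeroshot[k] + alpha * theta_finetuned[k]
        for k in theta_zeroshot
    }

# Toy usage: tiny linear layers stand in for CLIP-sized encoders.
zeroshot, finetuned, patched = (torch.nn.Linear(4, 2) for _ in range(3))
patched.load_state_dict(
    interpolate_weights(zeroshot.state_dict(), finetuned.state_dict(), alpha=0.5)
)
```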

Forget-free continual learning with winning subnetworks

H Kang, RJL Mina, SRH Madjid… - International …, 2022 - proceedings.mlr.press
Inspired by the Lottery Ticket Hypothesis that competitive subnetworks exist within a
dense network, we propose a continual learning method referred to as Winning …
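
A minimal sketch of the subnetwork-selection idea, assuming a per-weight score tensor and a fixed capacity fraction c; the actual method learns the scores jointly with the weights and stores one binary mask per task, so that selected weights can be frozen against later tasks.

```python
import torch

def top_c_mask(scores: torch.Tensor, c: float) -> torch.Tensor:
    """Binary mask keeping the top c-fraction of weights by score."""
    k = max(1, int(c * scores.numel()))
    threshold = scores.flatten().topk(k).values.min()
    return (scores >= threshold).float()

# Toy forward pass: only the selected "winning" subnetwork is active.
weight = torch.randn(8, 8)   # dense layer weights
scores = torch.randn(8, 8)   # per-weight importance scores (learned in practice)
mask = top_c_mask(scores, c=0.3)
x = torch.randn(1, 8)
y = x @ (weight * mask).t()
```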

Improving generalization in federated learning by seeking flat minima

D Caldarola, B Caputo, M Ciccone - European Conference on Computer …, 2022 - Springer
Models trained in federated settings often suffer from degraded performance and
fail to generalize, especially when facing heterogeneous scenarios. In this work, we …
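
A common route to flat minima is Sharpness-Aware Minimization (SAM), which can be run inside each client's local update; the sketch below is a generic single-step SAM in PyTorch under that assumption, not the paper's exact procedure.

```python
import torch

def sam_step(model, loss_fn, x, y, optimizer, rho=0.05):
    """One SAM step: ascend to the worst-case point within an L2 ball of
    radius rho, then descend using the gradient computed there."""
    loss_fn(model(x), y).backward()          # gradient at current weights
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm() for g in grads]))
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                eps.append(None)
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)                        # perturb toward higher loss
            eps.append(e)
    model.zero_grad()
    loss_fn(model(x), y).backward()          # gradient at perturbed weights
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            if e is not None:
                p.sub_(e)                    # restore original weights
    optimizer.step()                         # sharpness-aware descent
    optimizer.zero_grad()

# Toy local update on one batch.
model = torch.nn.Linear(10, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
sam_step(model, torch.nn.functional.cross_entropy,
         torch.randn(4, 10), torch.randint(0, 2, (4,)), opt)
```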

Model ratatouille: Recycling diverse models for out-of-distribution generalization

A Ramé, K Ahuja, J Zhang, M Cord… - International …, 2023 - proceedings.mlr.press
Foundation models are redefining how AI systems are built. Practitioners now follow a
standard procedure to build their machine learning solutions: from a pre-trained foundation …
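
The last step of such recycling recipes is typically a uniform weight average of several models fine-tuned on the target task from diverse initializations; a minimal sketch, assuming state dicts with matching keys and floating-point tensors.

```python
import torch

def average_state_dicts(state_dicts):
    """Uniformly average compatible state dicts (soup-style merging)."""
    return {
        k: torch.stack([sd[k] for sd in state_dicts]).mean(dim=0)
        for k in state_dicts[0]
    }

# Toy usage: three fine-tuned variants merged into one model.
variants = [torch.nn.Linear(4, 2) for _ in range(3)]
merged = torch.nn.Linear(4, 2)
merged.load_state_dict(average_state_dicts([m.state_dict() for m in variants]))
```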

Understanding the role of training regimes in continual learning

SI Mirzadeh, M Farajtabar, R Pascanu… - Advances in …, 2020 - proceedings.neurips.cc
Catastrophic forgetting affects the training of neural networks, limiting their ability to learn
multiple tasks sequentially. From the perspective of the well-established plasticity-stability …

An empirical investigation of the role of pre-training in lifelong learning

SV Mehta, D Patil, S Chandar, E Strubell - Journal of Machine Learning …, 2023 - jmlr.org
The lifelong learning paradigm in machine learning is an attractive alternative to the more
prominent isolated learning scheme not only due to its resemblance to biological learning …

Mechanistic mode connectivity

ES Lubana, EJ Bigelow, RP Dick… - International …, 2023 - proceedings.mlr.press
We study neural network loss landscapes through the lens of mode connectivity, the
observation that minimizers of neural networks retrieved via training on a dataset are …
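
Linear mode connectivity is straightforward to probe empirically: evaluate the loss along the straight line between two minimizers and look for a barrier. A minimal sketch with illustrative names; the paper's mechanistic analysis goes beyond this loss-only view.

```python
import torch

@torch.no_grad()
def loss_along_linear_path(model, theta_a, theta_b, loss_fn, x, y, steps=11):
    """Loss at theta(t) = (1 - t) * theta_a + t * theta_b for t in [0, 1].

    A flat profile suggests the two minimizers are (linearly) mode-connected;
    a bump in the middle is a loss barrier separating distinct modes.
    """
    losses = []
    for t in torch.linspace(0, 1, steps):
        interp = {k: (1 - t) * theta_a[k] + t * theta_b[k] for k in theta_a}
        model.load_state_dict(interp)
        losses.append(loss_fn(model(x), y).item())
    return losses

# Toy usage with two independently initialized networks.
net_a, net_b, probe = (torch.nn.Linear(4, 2) for _ in range(3))
profile = loss_along_linear_path(
    probe, net_a.state_dict(), net_b.state_dict(),
    torch.nn.functional.cross_entropy,
    torch.randn(16, 4), torch.randint(0, 2, (16,)),
)
```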

Federated continual learning with weighted inter-client transfer

J Yoon, W Jeong, G Lee, E Yang… - … on Machine Learning, 2021 - proceedings.mlr.press
There has been a surge of interest in continual learning and federated learning, both of
which are important for deep neural networks deployed in real-world scenarios. Yet little research has …
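
A heavily simplified sketch of the weighted inter-client transfer idea: each client's effective weights combine a sparsely masked shared base, its own task-adaptive parameters, and an attention-weighted sum of task-adaptive parameters received from other clients. Shapes and names are illustrative, not the paper's exact parameterization.

```python
import torch

base = torch.randn(8, 8)               # server-aggregated shared weights
mask = torch.rand(8, 8)                # client-specific sparse mask (learned)
own_adaptive = torch.randn(8, 8)       # this client's task-adaptive term
others = [torch.randn(8, 8) for _ in range(3)]    # from other clients
attention = torch.softmax(torch.randn(3), dim=0)  # learned transfer weights

# Effective weights for this client's current task.
effective = base * mask + own_adaptive + sum(
    a * adap for a, adap in zip(attention, others)
)
```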