E Belouadah, A Popescu, I Kanellos - arXiv preprint arXiv:2008.13710, 2020 - arxiv.org
Incremental Learning (IL) is useful when artificial systems need to deal with streams of data and do not have access to all data at all times. The most challenging setting requires a …
Conventionally, deep neural networks are trained offline, relying on a large dataset prepared in advance. This paradigm is often challenged in real-world applications, e.g. online …
Despite the impressive performance in many individual tasks, deep neural networks suffer from catastrophic forgetting when learning new tasks incrementally. Recently, various …
Y Wu, Y Chen, L Wang, Y Ye, Z Liu… - Proceedings of the …, 2019 - openaccess.thecvf.com
Modern machine learning suffers from catastrophic forgetting when learning new classes incrementally. The performance dramatically degrades due to the missing data of old …
Deep neural networks (DNNs) often suffer from "catastrophic forgetting" during incremental learning (IL): an abrupt degradation of performance on the original set of classes when the …
Class Incremental Learning (CIL) aims at learning a classifier in a phase-by-phase manner, in which only data from a subset of the classes is provided at each phase. Previous …
Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks. This behavior emerges because the parameter updates optimized for the new …
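The mechanism this snippet describes, parameter updates optimized for the new task overwriting what the weights encoded for the old one, can be demonstrated in a few lines. The sketch below is a minimal illustration of catastrophic forgetting (not an implementation of any of the cited methods): a logistic-regression model is trained on an "old" task, then further trained only on a "new" task with flipped labels, and its old-task accuracy collapses. All names and the toy data are hypothetical.

```python
import numpy as np

def train(w, X, y, lr=0.5, epochs=200):
    """Plain gradient descent on the logistic loss; each update
    overwrites whatever the weights encoded for earlier tasks."""
    for _ in range(epochs):
        z = np.clip(X @ w, -30.0, 30.0)      # clip logits to avoid overflow in exp
        p = 1.0 / (1.0 + np.exp(-z))          # sigmoid predictions
        w = w - lr * (X.T @ (p - y)) / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean(((X @ w) > 0) == (y == 1)))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))                 # one feature, 200 samples
y_old = (X[:, 0] > 0).astype(float)           # "old" task: positive inputs -> 1
y_new = 1.0 - y_old                           # "new" task: the same inputs, labels flipped

w = np.zeros(1)
w = train(w, X, y_old)
print("old-task accuracy after old task:", accuracy(w, X, y_old))  # high

w = train(w, X, y_new)                        # continue training on new data only
print("new-task accuracy:", accuracy(w, X, y_new))                 # high
print("old-task accuracy after new task:", accuracy(w, X, y_old))  # collapses
```

Because nothing in the new-task gradient preserves the old decision rule, the single weight simply flips sign; rehearsal, distillation, and parameter-isolation methods in the works listed here all add some mechanism to counteract exactly this.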
Classical deep neural networks are limited in their ability to learn from emerging streams of training data. When trained sequentially on new or evolving tasks, their performance …
Recent years have witnessed growing interest in developing deep models for incremental learning. However, existing approaches often use a fixed structure and online …