Deep neural networks (DNNs) often suffer from" catastrophic forgetting" during incremental learning (IL)---an abrupt degradation of performance on the original set of classes when the …
Although deep learning approaches have stood out in recent years due to their state-of-the-art results, they continue to suffer from catastrophic forgetting, a dramatic decrease in overall …
B Zhao, X Xiao, G Gan, B Zhang… - Proceedings of the …, 2020 - openaccess.thecvf.com
Deep neural networks (DNNs) have been applied in class incremental learning, which aims to solve common real-world problems of learning new classes continually. One drawback of …
Y Wu, Y Chen, L Wang, Y Ye, Z Liu… - Proceedings of the …, 2019 - openaccess.thecvf.com
Modern machine learning suffers from catastrophic forgetting when learning new classes incrementally. The performance dramatically degrades due to the missing data of old …
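The degradation these abstracts describe is easy to reproduce. Below is a minimal sketch in PyTorch, not taken from any of the cited papers: a small classifier is first fit on synthetic "old" classes, then fine-tuned on disjoint "new" classes with none of the old data retained, mirroring the missing-old-data setting above. Every name and size here (make_task, CENTERS, the 64-unit MLP, 200 epochs) is an illustrative assumption.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

DIM, N_CLASSES = 20, 10
CENTERS = torch.randn(N_CLASSES, DIM) * 4.0  # one fixed cluster center per class

def make_task(class_ids, n_per_class=200):
    """Synthetic Gaussian blob per class; returns (inputs, integer labels)."""
    xs = [CENTERS[c] + torch.randn(n_per_class, DIM) for c in class_ids]
    ys = [torch.full((n_per_class,), c, dtype=torch.long) for c in class_ids]
    return torch.cat(xs), torch.cat(ys)

def train(model, x, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.cross_entropy(model(x), y).backward()
        opt.step()

@torch.no_grad()
def accuracy(model, x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, N_CLASSES))

old_x, old_y = make_task(range(0, 5))   # phase 1: classes 0-4
new_x, new_y = make_task(range(5, 10))  # phase 2: classes 5-9, no old data kept

train(model, old_x, old_y)
print(f"old-class accuracy after phase 1: {accuracy(model, old_x, old_y):.2f}")

train(model, new_x, new_y)  # fine-tune on new classes only
print(f"old-class accuracy after phase 2: {accuracy(model, old_x, old_y):.2f}")
print(f"new-class accuracy after phase 2: {accuracy(model, new_x, new_y):.2f}")
```

In the second phase the cross-entropy loss contains no term for classes 0-4, so nothing stops the optimizer from reassigning their probability mass to the new classes; old-class accuracy typically collapses toward zero even while new-class accuracy is high.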
J He, R Mao, Z Shao, F Zhu - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com
Modern deep learning approaches have achieved great success in many vision applications by training a model using all available task-specific data. However, there are two major …
Despite the impressive performance in many individual tasks, deep neural networks suffer from catastrophic forgetting when learning new tasks incrementally. Recently, various …
Conventionally, deep neural networks are trained offline, relying on a large dataset prepared in advance. This paradigm is often challenged in real-world applications, e.g., online …
Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks. This behavior emerges because the parameter updates optimized for the new …
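The mechanism hinted at in that snippet can be made concrete with a single gradient computation. With softmax cross-entropy, the gradient of the loss with respect to the logit of class k is p_k - 1[k = y]; if a batch contains only new-class labels, old classes never match y, so their bias gradients are strictly positive and every plain gradient step pushes old-class scores down. The sketch below checks this numerically; the layer shapes and batch are arbitrary assumptions, not the quoted paper's setup.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
n_old, n_new, dim = 5, 5, 20

# Final classifier layer only: weights and biases for old + new classes.
W = torch.randn(n_old + n_new, dim, requires_grad=True)
b = torch.zeros(n_old + n_new, requires_grad=True)

x = torch.randn(64, dim)                        # a batch of new-task features
y = torch.randint(n_old, n_old + n_new, (64,))  # labels drawn only from new classes

F.cross_entropy(x @ W.t() + b, y).backward()

# dL/db_k = mean_i (p_ik - 1[y_i == k]); old classes never appear as targets,
# so their bias gradients are all positive: a gradient step lowers old-class
# scores every time it is taken on new-class-only data.
print("old-class bias grads (all positive):", b.grad[:n_old])
print("new-class bias grads (mixed sign):  ", b.grad[n_old:])
```

Accumulated over many steps, this one-sided pressure is a concrete instance of updates optimized for the new task degrading the old ones, and it is one source of the bias toward new classes that some of the approaches above aim to correct.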
Deep learning systems typically suffer from catastrophic forgetting of past knowledge when acquiring new skills continually. In this paper, we emphasize two dilemmas, representation …