J Kalla, S Biswas - European Conference on Computer Vision, 2022 - Springer
Few-shot class-incremental learning (FSCIL) aims to learn progressively about new classes with very few labeled samples, without forgetting the knowledge of already learnt classes …
DW Zhou, HJ Ye, DC Zhan - Proceedings of the 29th ACM International …, 2021 - dl.acm.org
Traditional learning systems are trained in a closed world on a fixed number of classes and require datasets collected in advance. However, new classes often emerge in real-world …
Conclusion We have presented PyCIL, a class-incremental learning toolbox written in Python. It contains implementations of a number of founding studies of CIL, but also provides …
As a challenging problem, few-shot class-incremental learning (FSCIL) continually learns a sequence of tasks, confronting the dilemma between slow forgetting of old knowledge and …
Neural networks suffer from catastrophic forgetting when sequentially learning tasks phase-by-phase, making them inapplicable in dynamically updated systems. Class-incremental …
Y Liu, X Hong, X Tao, S Dong, J Shi… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Deep models have been shown to be vulnerable to catastrophic forgetting, a phenomenon in which recognition performance on old data degrades when a pre-trained model is fine-tuned …
Data-driven algorithms are studied and deployed in diverse domains to support critical decisions, directly impacting people's well-being. As a result, a growing community of …
S Kim, L Noci, A Orvieto… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
In contrast to the natural capabilities of humans to learn new tasks in a sequential fashion, neural networks are known to suffer from catastrophic forgetting, where the model's …
Z Qian, X Wang, X Duan, P Qin… - Proceedings of the …, 2023 - openaccess.thecvf.com
In the real world, a desirable Visual Question Answering model is expected to provide correct answers to new questions and images in a continual setting (recognized as CL …