Author
Eden Belouadah
Publication date
2021/11/29
Institution
Ecole nationale supérieure Mines-Télécom Atlantique
Abstract
Incremental learning (IL) enables the adaptation of artificial agents to dynamic environments in which data is presented in streams. This type of learning is needed when access to past data is limited or impossible, but it is affected by catastrophic forgetting. This phenomenon is a drastic performance drop on previously learned information when new data is ingested. One way to tackle this problem is to use a limited memory of the past to refresh previously learned knowledge. Currently, memory-based approaches achieve the best results among state-of-the-art methods. In this thesis, we present several methods, with and without a memory of the past. Our methods deal with catastrophic forgetting by (1) calibrating the scores of past and new classes at the end of the network, (2) replaying the initial class weights, or (3) transferring knowledge between reference and target datasets. In particular, we investigate the usefulness of widely used knowledge distillation and the effect of enabling or disabling a memory of the past. Extensive experiments against a range of state-of-the-art approaches were conducted to validate the effectiveness of our methods.
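The abstract stays at a high level; as an illustration, the PyTorch sketch below shows how two of the ingredients it mentions, a bounded rehearsal memory and a knowledge distillation term, are typically combined in class-incremental training, together with a simple post-hoc rescaling of past-class scores in the spirit of method (1). All names here (incremental_step, distillation_loss, calibrate_past_scores, the lam weight) and the calibration rule are illustrative assumptions, not the thesis's actual implementation.

    import torch
    import torch.nn.functional as F

    def distillation_loss(new_logits, old_logits, T=2.0):
        # Match the softened outputs of the previous model on the classes
        # it already knows; T is the usual distillation temperature.
        log_p_new = F.log_softmax(new_logits / T, dim=1)
        p_old = F.softmax(old_logits / T, dim=1)
        return F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)

    def calibrate_past_scores(logits, n_old, gamma):
        # Hypothetical post-hoc rectification: rescale past-class scores
        # so they stay comparable with new-class scores at prediction time.
        logits = logits.clone()
        logits[:, :n_old] *= gamma
        return logits

    def incremental_step(model, old_model, loader, optimizer, n_old, lam=1.0):
        # One pass over new-class data mixed with exemplars replayed from
        # a bounded memory of the past (the loader is assumed to yield both).
        old_model.eval()
        for x, y in loader:
            logits = model(x)
            ce = F.cross_entropy(logits, y)  # supervision on all classes seen so far
            with torch.no_grad():
                old_logits = old_model(x)[:, :n_old]
            kd = distillation_loss(logits[:, :n_old], old_logits)
            loss = ce + lam * kd  # lam trades plasticity for stability
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

The weight lam controls how strongly the previous model constrains the new one; with no memory at all, the distillation term becomes the only guard against forgetting, which is precisely the trade-off the thesis investigates.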