Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation

M Kang, J Park, B Han - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
We present a novel class incremental learning approach based on deep neural networks,
which continually learns new tasks with limited memory for storing examples in the previous …

Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation

M Kang, J Park, B Han - arXiv preprint arXiv:2204.00895, 2022 - arxiv.org
We present a novel class incremental learning approach based on deep neural networks,
which continually learns new tasks with limited memory for storing examples in the previous …
