Looking through the past: better knowledge retention for generative replay in continual learning

V Khan, S Cygert, B Twardowski… - Proceedings of the IEEE/CVF International Conference on …, 2023 - openaccess.thecvf.com
Abstract
In this work, we improve generative replay in a continual learning setting. We notice that in VAE-based generative replay, the generated features are quite far from the original ones when mapped to the latent space. Therefore, we propose modifications that allow the model to learn and generate complex data. More specifically, we incorporate distillation in latent space between the current and previous models to reduce feature drift. Additionally, latent matching between the reconstructions and the original data is proposed to improve the alignment of generated features. Further, based on the observation that reconstructions are better at preserving knowledge, we cycle generations through the previously trained model to bring them closer to the original data. Our method outperforms other generative replay methods in various scenarios.
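To make the three mechanisms described above concrete, the following is a minimal PyTorch sketch, assuming a toy VAE. The class and function names (SimpleVAE, replay_losses, cycled_generations), the loss weights, and the use of mean-squared error in latent space are illustrative assumptions, not the paper's exact formulation. It shows (1) latent-space distillation against the frozen previous model, (2) latent matching between re-encoded reconstructions and the original data, and (3) cycling sampled generations through the previous model.

```python
# Hedged sketch of the three ideas in the abstract; all names and weights
# below are assumptions for illustration, not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleVAE(nn.Module):
    """Toy VAE over flattened inputs; stands in for the replay generator."""

    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim))

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def decode(self, z):
        return self.dec(z)


def replay_losses(vae_new, vae_old, x_real, lambda_distill=1.0, lambda_match=1.0):
    """Auxiliary losses sketched from the abstract (weights are assumptions)."""
    mu_new, logvar_new = vae_new.encode(x_real)
    z_new = vae_new.reparameterize(mu_new, logvar_new)
    x_rec = vae_new.decode(z_new)

    # (1) Latent distillation: the current encoder should map data close to
    # where the frozen previous encoder mapped it, reducing feature drift.
    with torch.no_grad():
        mu_old, _ = vae_old.encode(x_real)
    distill = F.mse_loss(mu_new, mu_old)

    # (2) Latent matching: reconstructions and originals should land on the
    # same point in latent space when re-encoded by the current model.
    mu_rec, _ = vae_new.encode(x_rec)
    match = F.mse_loss(mu_rec, mu_new.detach())

    return lambda_distill * distill + lambda_match * match


def cycled_generations(vae_old, n, latent_dim=32):
    """(3) Cycle sampled generations once through the frozen previous model,
    so replayed samples behave more like reconstructions of original data."""
    with torch.no_grad():
        z = torch.randn(n, latent_dim)
        x_gen = vae_old.decode(z)
        mu, logvar = vae_old.encode(x_gen)
        return vae_old.decode(vae_old.reparameterize(mu, logvar))
```

In use, the auxiliary term from replay_losses would be added to the usual VAE reconstruction and KL objectives on a mix of current-task data and cycled_generations output; the exact weighting and training schedule are left open here.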