S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning

Y Wang, Z Huang, X Hong - Advances in Neural …, 2022 - proceedings.neurips.cc
State-of-the-art deep neural networks are still struggling to address the catastrophic
forgetting problem in continual learning. In this paper, we propose one simple paradigm …
