Backdoor pre-trained models can transfer to all

L Shen, S Ji, X Zhang, J Li, J Chen, J Shi… - arXiv preprint arXiv …, 2021 - arxiv.org
Pre-trained general-purpose language models have been a dominating component in
enabling real-world natural language processing (NLP) applications. However, a pre-trained …
