Revisiting reverse distillation for anomaly detection

TD Tien, AT Nguyen, NH Tran… - Proceedings of the …, 2023 - openaccess.thecvf.com
Anomaly detection is an important application in large-scale industrial manufacturing.
Recent methods for this task have demonstrated excellent accuracy but come with a latency …

Prompt as triggers for backdoor attack: Examining the vulnerability in language models

S Zhao, J Wen, LA Tuan, J Zhao, J Fu - arXiv preprint arXiv:2305.01219, 2023 - arxiv.org
The prompt-based learning paradigm, which bridges the gap between pre-training and fine-
tuning, achieves state-of-the-art performance on several NLP tasks, particularly in few-shot …

A survey on cross-lingual summarization

J Wang, F Meng, D Zheng, Y Liang, Z Li… - Transactions of the …, 2022 - direct.mit.edu
Cross-lingual summarization is the task of generating a summary in one language (e.g.,
English) for the given document(s) in a different language (e.g., Chinese). Under the …

Zero-shot cross-lingual summarization via large language models

J Wang, Y Liang, F Meng, B Zou, Z Li, J Qu… - arXiv preprint arXiv …, 2023 - arxiv.org
Given a document in a source language, cross-lingual summarization (CLS) aims to
generate a summary in a different target language. Recently, the emergence of Large …

A variational hierarchical model for neural cross-lingual summarization

Y Liang, F Meng, C Zhou, J Xu, Y Chen, J Su… - arXiv preprint arXiv …, 2022 - arxiv.org
The goal of cross-lingual summarization (CLS) is to convert a document in one language
(e.g., English) into a summary in another one (e.g., Chinese). Essentially, the CLS task is the …

A survey of backdoor attacks and defenses on large language models: Implications for security measures

S Zhao, M Jia, Z Guo, L Gan, X Xu, X Wu, J Fu… - arXiv preprint arXiv …, 2024 - arxiv.org
Large Language Models (LLMs), which bridge the gap between human language
understanding and complex problem-solving, achieve state-of-the-art performance on …

Towards unifying multi-lingual and cross-lingual summarization

J Wang, F Meng, D Zheng, Y Liang, Z Li, J Qu… - arXiv preprint arXiv …, 2023 - arxiv.org
To adapt text summarization to the multilingual world, previous work proposes multi-lingual
summarization (MLS) and cross-lingual summarization (CLS). However, these two tasks …

Universal vulnerabilities in large language models: Backdoor attacks for in-context learning

S Zhao, M Jia, LA Tuan, F Pan… - arXiv preprint arXiv …, 2024 - researchgate.net
In-context learning, a paradigm bridging the gap between pre-training and fine-tuning, has
demonstrated high efficacy in several NLP tasks, especially in few-shot settings. Despite …

Improving cross-lingual information retrieval on low-resource languages via optimal transport distillation

Z Huang, P Yu, J Allan - Proceedings of the Sixteenth ACM International …, 2023 - dl.acm.org
Benefiting from transformer-based pre-trained language models, neural ranking models
have made significant progress. More recently, the advent of multilingual pre-trained …

Understanding translationese in cross-lingual summarization

J Wang, F Meng, Y Liang, T Zhang, J Xu, Z Li… - arXiv preprint arXiv …, 2022 - arxiv.org
Given a document in a source language, cross-lingual summarization (CLS) aims at
generating a concise summary in a different target language. Unlike monolingual …