A survey on dialogue summarization: Recent advances and new frontiers

X Feng, X Feng, B Qin - arXiv preprint arXiv:2107.03175, 2021 - arxiv.org
Dialogue summarization aims to condense the original dialogue into a shorter version
covering salient information, which is a crucial way to reduce dialogue data overload …

A survey of resource-efficient LLM and multimodal foundation models

M Xu, W Yin, D Cai, R Yi, D Xu, Q Wang, B Wu… - arXiv preprint arXiv …, 2024 - arxiv.org
Large foundation models, including large language models (LLMs), vision transformers
(ViTs), diffusion, and LLM-based multimodal models, are revolutionizing the entire machine …

Exploring the capabilities of LLMs for code change related tasks

L Fan, J Liu, Z Liu, D Lo, X Xia, S Li - ACM Transactions on Software …, 2024 - dl.acm.org
Developers deal with code-change-related tasks daily, e.g., reviewing code. Pre-trained code
and code-change-oriented models have been adapted to help developers with such tasks …

Resource-efficient Algorithms and Systems of Foundation Models: A Survey

M Xu, D Cai, W Yin, S Wang, X Jin, X Liu - ACM Computing Surveys, 2025 - dl.acm.org
Large foundation models, including large language models, vision transformers, diffusion,
and large language model based multimodal models, are revolutionizing the entire machine …

Large language models meet NLP: A survey

L Qin, Q Chen, X Feng, Y Wu, Y Zhang, Y Li… - arXiv preprint arXiv …, 2024 - arxiv.org
While large language models (LLMs) like ChatGPT have shown impressive capabilities in
Natural Language Processing (NLP) tasks, a systematic investigation of their potential in this …

On prefix-tuning for lightweight out-of-distribution detection

Y Ouyang, Y Cao, Y Gao, Z Wu, J Zhang… - Proceedings of the 61st …, 2023 - aclanthology.org
Out-of-distribution (OOD) detection, a fundamental task vexing real-world
applications, has attracted growing attention in the NLP community. Recently fine-tuning …

Compositional zero-shot domain transfer with text-to-text models

F Liu, Q Liu, S Bannur, F Pérez-García… - Transactions of the …, 2023 - direct.mit.edu
Label scarcity is a bottleneck for improving task performance in specialized domains. We
propose a novel compositional transfer learning framework (DoT5) for zero-shot domain …

KnowPrefix-tuning: A two-stage prefix-tuning framework for knowledge-grounded dialogue generation

J Bai, Z Yan, Z Yang, J Yang, X Liang, H Guo… - … European Conference on …, 2023 - Springer
Existing knowledge-grounded conversation systems generate responses typically in a
retrieve-then-generate manner. They require a large knowledge base and a strong …

Few-shot query-focused summarization with prefix-merging

R Yuan, Z Wang, Z Cao, W Li - arXiv preprint arXiv:2211.16164, 2022 - arxiv.org
Query-focused summarization has been considered an important extension of text
summarization. It aims to generate a concise highlight for a given query. Different from text …

ADPL: Adversarial prompt-based domain adaptation for dialogue summarization with knowledge disentanglement

L Zhao, F Zheng, W Zeng, K He, R Geng… - Proceedings of the 45th …, 2022 - dl.acm.org
Traditional dialogue summarization models rely on a large-scale manually-labeled corpus,
lacking generalization ability to new domains, and domain adaptation from a labeled source …