The goal of text-to-text generation is to enable machines to express themselves like humans in applications such as conversation, summarization, and translation. It is one of the most …
Large language models (LLMs), such as ChatGPT, are able to generate human-like, fluent responses for many downstream tasks, e.g., task-oriented dialog and question answering …
We study knowledge-grounded dialogue generation with pre-trained language models. To leverage the redundant external knowledge under a capacity constraint, we propose …
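The entry above mentions selecting from redundant external knowledge under a capacity (input-length) constraint. Purely as an illustrative sketch, and not the paper's actual method, the following Python ranks candidate knowledge sentences by a simple word-overlap score against the dialogue context and greedily keeps those that fit a hypothetical token budget:

# Illustrative sketch: greedy knowledge selection under a token budget.
# The overlap scorer and the budget value are assumptions, not the paper's method.

def word_overlap(context: str, sentence: str) -> int:
    """Score a knowledge sentence by how many context words it shares."""
    return len(set(context.lower().split()) & set(sentence.lower().split()))

def select_knowledge(context: str, candidates: list[str], budget: int = 128) -> list[str]:
    """Greedily keep the highest-scoring sentences that still fit the budget."""
    ranked = sorted(candidates, key=lambda s: word_overlap(context, s), reverse=True)
    selected, used = [], 0
    for sentence in ranked:
        length = len(sentence.split())
        if used + length <= budget:
            selected.append(sentence)
            used += length
    return selected

if __name__ == "__main__":
    ctx = "Who directed Blade Runner and when was it released?"
    facts = [
        "Blade Runner is a 1982 science fiction film directed by Ridley Scott.",
        "Ridley Scott also directed Alien in 1979.",
        "The film is loosely based on a novel by Philip K. Dick.",
    ]
    print(select_knowledge(ctx, facts, budget=40))

In practice the scorer would be a learned knowledge-selection module and the budget would come from the pre-trained model's maximum input length; the greedy loop above only illustrates the capacity-constrained selection idea.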
M Huang, X Zhu, J Gao - ACM Transactions on Information Systems …, 2020 - dl.acm.org
There is a resurgent interest in developing intelligent open-domain dialog systems due to the availability of large amounts of conversational data and the recent progress on neural …
K Zhou, Y Zhou, WX Zhao, X Wang, JR Wen - arXiv preprint arXiv …, 2020 - arxiv.org
Conversational recommender systems (CRS) aim to recommend high-quality items to users through interactive conversations. To develop an effective CRS, the support of high-quality …
H Li, Y Su, D Cai, Y Wang, L Liu - arXiv preprint arXiv:2202.01110, 2022 - arxiv.org
Recently, retrieval-augmented text generation has attracted increasing attention from the computational linguistics community. Compared with conventional generation models …
Knowledge-grounded dialogue systems are intended to convey information that is based on evidence provided in a given source text. We discuss the challenges of training a generative …
D Cai, Y Wang, L Liu, S Shi - Proceedings of the 45th international ACM …, 2022 - dl.acm.org
Recently, retrieval-augmented text generation has achieved state-of-the-art performance on many NLP tasks and has attracted increasing attention from the NLP and IR communities. This …
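As a minimal illustration of the retrieval-augmented setup the two entries above describe (not any specific system from those papers), the sketch below retrieves the top-k passages for a query with a toy overlap scorer and prepends them to the prompt handed to a generator; retrieve, generate, and rag_answer are hypothetical names, and generate stands in for whatever language model is actually used:

# Minimal retrieval-augmented generation sketch; the retriever is a toy
# word-overlap scorer and `generate` is a placeholder for a real language model.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k passages sharing the most words with the query."""
    def score(passage: str) -> int:
        return len(set(query.lower().split()) & set(passage.lower().split()))
    return sorted(corpus, key=score, reverse=True)[:k]

def generate(prompt: str) -> str:
    """Placeholder generator; in practice this would call an LLM."""
    return f"[model response conditioned on {len(prompt.split())} prompt tokens]"

def rag_answer(query: str, corpus: list[str]) -> str:
    """Concatenate retrieved evidence with the query and generate a response."""
    evidence = "\n".join(retrieve(query, corpus))
    prompt = f"Evidence:\n{evidence}\n\nQuestion: {query}\nAnswer:"
    return generate(prompt)

if __name__ == "__main__":
    docs = [
        "Blade Runner was directed by Ridley Scott.",
        "The Matrix was released in 1999.",
        "Ridley Scott was born in 1937.",
    ]
    print(rag_answer("Who directed Blade Runner?", docs))

Real systems replace the overlap scorer with a sparse (e.g., BM25) or dense retriever and condition a trained generator on the retrieved evidence, but the retrieve-then-generate flow is the same.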
B Kim, J Ahn, G Kim - arXiv preprint arXiv:2002.07510, 2020 - arxiv.org
Knowledge-grounded dialogue is the task of generating an informative response based on both the discourse context and external knowledge. As we focus on better modeling the …
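Purely as an illustration of the input format such knowledge-grounded dialogue models commonly consume (not the formulation of this particular paper), the sketch below flattens a dialogue history and a selected knowledge sentence into a single sequence that a sequence-to-sequence model could be trained on; the separator tokens and field order are assumptions:

# Illustrative only: flatten dialogue history and selected knowledge into one
# model input string. Separator tokens and field order are assumptions.

def build_grounded_input(history: list[str], knowledge: str) -> str:
    """Join past turns and the grounding sentence with explicit separators."""
    turns = " <turn> ".join(history)
    return f"<knowledge> {knowledge} <dialogue> {turns} <response>"

if __name__ == "__main__":
    history = [
        "Have you seen Blade Runner?",
        "Yes, I love it. Who directed it again?",
    ]
    knowledge = "Blade Runner is a 1982 film directed by Ridley Scott."
    print(build_grounded_input(history, knowledge))
    # A seq2seq model would be trained to map this input to the gold response,
    # e.g. "It was directed by Ridley Scott in 1982."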