Beyond factuality: A comprehensive evaluation of large language models as knowledge generators

L Chen, Y Deng, Y Bian, Z Qin, B Wu, TS Chua… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) outperform information retrieval techniques on downstream
knowledge-intensive tasks when prompted to generate world knowledge. However …

Infusing internalized knowledge of language models into hybrid prompts for knowledgeable dialogue generation

J Bai, Z Yan, S Zhang, J Yang, H Guo, Z Li - Knowledge-Based Systems, 2024 - Elsevier
Existing knowledge-grounded dialogue (KGD) systems access knowledge from an
external knowledge base and then generate a context-coherent response accordingly …

Knowledge rumination for pre-trained language models

Y Yao, P Wang, S Mao, C Tan, F Huang… - arXiv preprint arXiv …, 2023 - arxiv.org
Previous studies have revealed that vanilla pre-trained language models (PLMs) lack the
capacity to handle knowledge-intensive NLP tasks alone; thus, several works have …

Knowprefix-tuning: A two-stage prefix-tuning framework for knowledge-grounded dialogue generation

J Bai, Z Yan, Z Yang, J Yang, X Liang, H Guo… - … European Conference on …, 2023 - Springer
Existing knowledge-grounded conversation systems typically generate responses in a
retrieve-then-generate manner. They require a large knowledge base and a strong …

Measuring the knowledge acquisition-utilization gap in pretrained language models

A Kazemnejad, M Rezagholizadeh… - arXiv preprint arXiv …, 2023 - arxiv.org
While pre-trained language models (PLMs) have shown evidence of acquiring vast amounts
of knowledge, it remains unclear how much of this parametric knowledge is actually usable …

GKA-GPT: Graphical knowledge aggregation for multiturn dialog generation

Y Dong, K Qin, S Liang, A Raza, G Luo - Knowledge-Based Systems, 2025 - Elsevier
In human interaction, effective communication relies on shared cognitive processes that
help individuals comprehend the intended message of their interlocutors …

MixEI: Mixing explicit and implicit commonsense knowledge in open-domain dialogue response generation

S Wu, J Yu, W Zhou - Neurocomputing, 2025 - Elsevier
Inadequate awareness of real-world knowledge often causes machines to produce
generic responses, such as 'I think so.', which may diminish user interest …

RA2FD: Distilling Faithfulness into Efficient Dialogue Systems

Z Zhu, Y Liao, C Xu, Y Guan, Y Wang… - Proceedings of the 2024 …, 2024 - aclanthology.org
Generating faithful and fast responses is crucial in knowledge-grounded dialogue.
Retrieval Augmented Generation (RAG) strategies are effective but inefficient at inference …

Knowledge Interpolated Conditional Variational Auto-Encoder for Knowledge Grounded Dialogues

X Liang, J Du, T Niu, L Zhou, R Xu - Applied Sciences, 2023 - mdpi.com
In Knowledge Grounded Dialogue (KGD) generation, explicitly modeling the instance-level
variety of knowledge specificity, and seamlessly fusing that knowledge with the dialogue context, remains …

Post-hoc Utterance Refining Method by Entity Mining for Faithful Knowledge Grounded Conversations

Y Jang, S Son, J Lee, J Son, Y Hur, J Lim… - arXiv preprint arXiv …, 2024 - arxiv.org
Despite striking advances in recent language generation performance, model-generated
responses have suffered from the chronic problem of hallucinations that are either untrue or …