Retrieval augmentation reduces hallucination in conversation

K Shuster, S Poff, M Chen, D Kiela, J Weston - arXiv preprint arXiv …, 2021 - arxiv.org
Despite showing increasingly human-like conversational abilities, state-of-the-art dialogue
models often suffer from factual incorrectness and hallucination of knowledge (Roller et al …

CBR-RAG: case-based reasoning for retrieval augmented generation in LLMs for legal question answering

N Wiratunga, R Abeyratne, L Jayawardena… - … Conference on Case …, 2024 - Springer
Retrieval-Augmented Generation (RAG) enhances Large Language Model (LLM)
output by providing prior knowledge as context to the input. This is beneficial for knowledge …
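The retrieve-then-prepend idea summarized in this snippet can be sketched in a few lines. This is a minimal illustration with a toy bag-of-words retriever, not the paper's actual system; the corpus, scoring function, and prompt template are all assumptions made here for demonstration.

```python
# Minimal RAG sketch: rank documents by token overlap with the query,
# then prepend the top-k passages to the prompt as context.
# The retriever and prompt format are illustrative assumptions only.
from collections import Counter

def score(query, doc):
    """Toy relevance score: count of overlapping lowercase tokens."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())

def build_prompt(query, corpus, k=2):
    """Prepend the k highest-scoring passages to the query as context."""
    ranked = sorted(corpus, key=lambda doc: score(query, doc), reverse=True)
    context = "\n".join(ranked[:k])
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "RAG provides retrieved passages as extra context to the model.",
    "Nearest neighbor search finds similar items in a datastore.",
    "Dialogue models can hallucinate facts without grounding.",
]
prompt = build_prompt("What context does RAG provide to the model?", corpus, k=1)
```

In a real system the overlap score would be replaced by a dense retriever and the prompt passed to an LLM, but the control flow is the same: retrieve, concatenate, generate.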

Relational memory-augmented language models

Q Liu, D Yogatama, P Blunsom - Transactions of the Association for …, 2022 - direct.mit.edu
We present a memory-augmented approach to condition an autoregressive language model
on a knowledge graph. We represent the graph as a collection of relation triples and retrieve …
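Representing a knowledge graph as relation triples and retrieving the ones relevant to the current context, as this snippet describes, can be sketched with a toy exact-match lookup. The graph contents and matching rule below are illustrative assumptions, not the paper's model.

```python
# Sketch of triple-based conditioning: store the knowledge graph as
# (head, relation, tail) tuples and retrieve those mentioning an entity
# from the context. Graph and lookup are toy examples, not the real method.
triples = [
    ("Paris", "capital_of", "France"),
    ("Paris", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
]

def retrieve(entity, kg):
    """Return every triple whose head or tail matches the entity."""
    return [t for t in kg if entity in (t[0], t[2])]

facts = retrieve("Paris", triples)
```

The retrieved triples would then be encoded and attended over by the language model as an external memory.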

Fast nearest neighbor machine translation

Y Meng, X Li, X Zheng, F Wu, X Sun, T Zhang… - arXiv preprint arXiv …, 2021 - arxiv.org
Though nearest neighbor machine translation (kNN-MT; Khandelwal et al., 2020) has been
shown to deliver significant performance boosts over standard neural MT systems, it …
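The core kNN-MT mechanism is to retrieve nearest neighbors of the decoder's hidden state from a datastore of (hidden state, target token) pairs and interpolate the induced distribution with the model's own. The sketch below uses toy 2-D vectors, a squared-distance metric, and an assumed interpolation weight; all values are illustrative.

```python
# Toy kNN-MT-style interpolation: mix the model's next-token distribution
# with a softmax over negative distances of the k nearest datastore entries.
# Vectors, datastore contents, and lambda are illustrative assumptions.
import math

def knn_distribution(query, datastore, k=2, temperature=1.0):
    """Softmax over negative squared distances of the k nearest entries."""
    dists = sorted(
        (sum((q - h) ** 2 for q, h in zip(query, hid)), tok)
        for hid, tok in datastore
    )[:k]
    weights = [math.exp(-d / temperature) for d, _ in dists]
    z = sum(weights)
    probs = {}
    for (d, tok), w in zip(dists, weights):
        probs[tok] = probs.get(tok, 0.0) + w / z
    return probs

def interpolate(p_model, p_knn, lam=0.5):
    """p(t) = lam * p_knn(t) + (1 - lam) * p_model(t) over the token union."""
    vocab = set(p_model) | set(p_knn)
    return {t: lam * p_knn.get(t, 0.0) + (1 - lam) * p_model.get(t, 0.0)
            for t in vocab}

datastore = [((0.0, 0.0), "cat"), ((0.1, 0.0), "cat"), ((1.0, 1.0), "dog")]
p = interpolate({"cat": 0.3, "dog": 0.7},
                knn_distribution((0.0, 0.1), datastore, k=2))
```

The "fast" variants cited in this listing speed up exactly this lookup step, since exhaustive search over a large datastore dominates decoding time.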

LitLLM: A Toolkit for Scientific Literature Review

S Agarwal, IH Laradji, L Charlin, C Pal - arXiv preprint arXiv:2402.01788, 2024 - arxiv.org
Conducting literature reviews for scientific papers is essential for understanding research, its
limitations, and building on existing work. It is a tedious task which makes an automatic …

Reason first, then respond: Modular generation for knowledge-infused dialogue

L Adolphs, K Shuster, J Urbanek, A Szlam… - arXiv preprint arXiv …, 2021 - arxiv.org
Large language models can produce fluent dialogue but often hallucinate factual
inaccuracies. While retrieval-augmented models help alleviate this issue, they still face a …

GNN-LM: Language modeling based on global contexts via GNN

Y Meng, S Zong, X Li, X Sun, T Zhang, F Wu… - arXiv preprint arXiv …, 2021 - arxiv.org
Inspired by the notion that "to copy is easier than to memorize", in this work we
introduce GNN-LM, which extends the vanilla neural language model (LM) by allowing it to …

kNN-NER: Named Entity Recognition with Nearest Neighbor Search

S Wang, X Li, Y Meng, T Zhang, R Ouyang, J Li… - arXiv preprint arXiv …, 2022 - arxiv.org
Inspired by recent advances in retrieval-augmented methods in NLP (Khandelwal et al.,
2019; Khandelwal et al., 2020; Meng et al., 2021), in this paper, we …

Simple and scalable nearest neighbor machine translation

Y Dai, Z Zhang, Q Liu, Q Cui, W Li, Y Du… - arXiv preprint arXiv …, 2023 - arxiv.org
kNN-MT is a straightforward yet powerful approach for fast domain adaptation, which
directly plugs pre-trained neural machine translation (NMT) models with domain-specific …

Retrieval-augmented few-shot text classification

G Yu, L Liu, H Jiang, S Shi, X Ao - Findings of the Association for …, 2023 - aclanthology.org
Retrieval-augmented methods are successful in the standard scenario where the retrieval
space is sufficient, whereas in the few-shot scenario with a limited retrieval space, this paper …