Retrieval-Augmented Generation (RAG) enhances Large Language Model (LLM) output by providing prior knowledge as context in the input. This is beneficial for knowledge …
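As an illustration of the retrieve-then-prompt pattern these snippets describe (a minimal sketch, not the method of any listed paper; the toy corpus and bag-of-words scorer are assumptions, where a real system would use a dense encoder and an LLM call):

```python
from collections import Counter
import math

# Hypothetical toy corpus for illustration only.
DOCS = [
    "kNN-MT augments translation models with a token-level datastore.",
    "Retrieval-augmented generation prepends retrieved passages to the prompt.",
    "Graph neural networks aggregate information from neighboring nodes.",
]

def embed(text):
    """Bag-of-words term counts; stands in for a dense text encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    """Rank the corpus by similarity to the query and keep the top k."""
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query):
    """Prepend retrieved passages as context, as RAG does before generation."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
```

The prompt returned by `build_prompt` is what would be fed to the LLM in place of the bare question.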
We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve …
Though nearest neighbor machine translation ($k$NN-MT)~\citep{khandelwal2020nearest} has been shown to deliver significant performance gains over standard neural MT systems, it …
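The core $k$NN mechanism behind these systems is to retrieve the $k$ nearest datastore entries to the current decoder state and interpolate their induced token distribution with the model's own. A minimal sketch under simplifying assumptions (plain Python lists as key vectors, squared-L2 distance, a fixed interpolation weight $\lambda$; the function names are illustrative):

```python
import math

def softmax_neg_dist(dists, temp=1.0):
    """Turn negative distances into a probability distribution over neighbors."""
    exps = [math.exp(-d / temp) for d in dists]
    z = sum(exps)
    return [e / z for e in exps]

def sq_l2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def knn_interpolate(p_model, datastore, query, k=2, lam=0.5):
    """
    p_model:   dict token -> model probability
    datastore: list of (key_vector, token) pairs; in kNN-MT the keys
               are decoder hidden states recorded over a corpus
    query:     current decoder-state vector
    """
    # Retrieve the k nearest keys and weight their tokens by distance.
    neighbors = sorted(datastore, key=lambda kv: sq_l2(kv[0], query))[:k]
    probs = softmax_neg_dist([sq_l2(key, query) for key, _ in neighbors])
    p_knn = {}
    for (_, tok), p in zip(neighbors, probs):
        p_knn[tok] = p_knn.get(tok, 0.0) + p
    # Final distribution: lam * p_kNN + (1 - lam) * p_model.
    vocab = set(p_model) | set(p_knn)
    return {t: lam * p_knn.get(t, 0.0) + (1 - lam) * p_model.get(t, 0.0)
            for t in vocab}
```

A close neighbor in the datastore can thus pull probability mass toward a token the base model ranked lower, which is what makes the approach attractive for domain adaptation without retraining.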
Conducting literature reviews for scientific papers is essential for understanding research, its limitations, and building on existing work. It is, however, a tedious task, which makes automatic …
Large language models can produce fluent dialogue but often hallucinate factual inaccuracies. While retrieval-augmented models help alleviate this issue, they still face a …
Inspired by the notion that ``{\it to copy is easier than to memorize}'', in this work we introduce GNN-LM, which extends the vanilla neural language model (LM) by allowing it to …
Inspired by recent advances in retrieval-augmented methods in NLP~\citep{khandelwal2019generalization, khandelwal2020nearest, meng2021gnn}, in this paper, we …
Y Dai, Z Zhang, Q Liu, Q Cui, W Li, Y Du… - arXiv preprint arXiv …, 2023 - arxiv.org
$k$NN-MT is a straightforward yet powerful approach for fast domain adaptation, which directly plugs pre-trained neural machine translation (NMT) models with domain-specific …
G Yu, L Liu, H Jiang, S Shi, X Ao - Findings of the Association for …, 2023 - aclanthology.org
Retrieval-augmented methods are successful in the standard scenario where the retrieval space is sufficient; whereas in the few-shot scenario with limited retrieval space, this paper …