Recently, retrieval-augmented text generation has attracted increasing attention from the computational linguistics community. Compared with conventional generation models …
D Saunders - Journal of Artificial Intelligence Research, 2022 - jair.org
The development of deep learning techniques has allowed Neural Machine Translation (NMT) models to become extremely powerful, given sufficient training data and training time …
Large language models (LLMs) demonstrate remarkable machine translation (MT) abilities via prompting, even though they were not explicitly trained for this task. However, even given …
D Wang, K Fan, B Chen, D Xiong - arXiv preprint arXiv:2204.06175, 2022 - arxiv.org
k-Nearest-Neighbor Machine Translation (kNN-MT) has recently been proposed as a non-parametric solution for domain adaptation in neural machine translation (NMT). It aims to …
F Nie, M Chen, Z Zhang, X Cheng - arXiv preprint arXiv:2212.02216, 2022 - arxiv.org
Pre-trained language models (PLMs) have exhibited remarkable few-shot learning capabilities when provided with a few examples in a natural language prompt as demonstrations …
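The snippet above refers to prompting a PLM with a handful of demonstrations. A minimal sketch of how such a few-shot prompt might be assembled for a translation task (the helper name, instruction text, and field labels are illustrative, not taken from any specific paper):

```python
def build_prompt(demonstrations, query, instruction="Translate English to German."):
    """Assemble a few-shot prompt: an instruction, then the demonstrations,
    then the query left open for the model to complete.

    `demonstrations` is a list of (source, target) pairs. All names and the
    prompt format here are illustrative assumptions.
    """
    lines = [instruction]
    for src, tgt in demonstrations:
        lines.append(f"English: {src}\nGerman: {tgt}")
    # The final line ends at "German:" so the model continues with the translation.
    lines.append(f"English: {query}\nGerman:")
    return "\n\n".join(lines)
```

The prompt would then be passed to the model as ordinary input text; the demonstrations steer the completion without any parameter updates.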
Open-sourced large language models (LLMs) have demonstrated remarkable efficacy in various tasks with instruction tuning. However, these models can sometimes struggle with …
Semi-parametric models, which augment generation with retrieval, have led to impressive results in language modeling and machine translation, due to their ability to retrieve fine …
k-Nearest-Neighbor Machine Translation (kNN-MT) has become an important research direction in NMT in recent years. Its main idea is to retrieve useful key-value pairs from an …
Y Dai, Z Zhang, Q Liu, Q Cui, W Li, Y Du… - arXiv preprint arXiv …, 2023 - arxiv.org
kNN-MT is a straightforward yet powerful approach for fast domain adaptation, which directly plugs pre-trained neural machine translation (NMT) models with domain-specific …
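Several snippets above describe the same kNN-MT mechanism: retrieve key-value pairs near the decoder's hidden state and mix the resulting distribution with the NMT model's prediction. A minimal NumPy sketch of one such decoding step, assuming L2 distance, a softmax over negative distances, and a fixed interpolation weight (hyperparameter names and defaults here are illustrative):

```python
import numpy as np

def knn_mt_step(hidden, keys, values, p_nmt, k=4, temperature=10.0, lam=0.5):
    """One decoding step of kNN-MT-style retrieval interpolation.

    hidden: decoder hidden state at the current step, shape (d,)
    keys:   datastore keys (hidden states saved from a parallel corpus), shape (n, d)
    values: target-token ids paired with each key, shape (n,)
    p_nmt:  the base NMT model's next-token distribution, shape (vocab,)
    """
    # Retrieve the k nearest datastore entries by squared L2 distance.
    dists = np.sum((keys - hidden) ** 2, axis=1)
    nn = np.argsort(dists)[:k]

    # Turn negative distances into weights over the retrieved tokens.
    w = np.exp(-dists[nn] / temperature)
    w /= w.sum()

    # Scatter the weights onto the vocabulary to form p_kNN.
    p_knn = np.zeros_like(p_nmt)
    for token, weight in zip(values[nn], w):
        p_knn[token] += weight

    # Interpolate the retrieval distribution with the model distribution.
    return lam * p_knn + (1.0 - lam) * p_nmt
```

Because the datastore is built from domain-specific text and queried at inference time, swapping datastores adapts the fixed NMT model to a new domain without retraining, which is the "plug-in" property the snippets emphasize.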