Adaptive nearest neighbor machine translation

X Zheng, Z Zhang, J Guo, S Huang, B Chen, W Luo, J Chen. arXiv preprint arXiv:2105.13022, 2021. arxiv.org
kNN-MT, recently proposed by Khandelwal et al. (2020a), successfully combines a pre-trained neural machine translation (NMT) model with token-level k-nearest-neighbor (kNN) retrieval to improve translation accuracy. However, the traditional kNN algorithm used in kNN-MT simply retrieves the same number of nearest neighbors for each target token, which may cause prediction errors when the retrieved neighbors include noise. In this paper, we propose Adaptive kNN-MT, which dynamically determines the value of k for each target token. We achieve this by introducing a lightweight Meta-k Network that can be efficiently trained with only a few training samples. On four benchmark machine translation datasets, we demonstrate that the proposed method effectively filters out noise in the retrieval results and significantly outperforms the vanilla kNN-MT model. Even more noteworthy, the Meta-k Network learned on one domain can be directly applied to other domains and obtain consistent improvements, illustrating the generality of our method. Our implementation is open-sourced at https://github.com/zhengxxn/adaptive-knn-mt.
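
The abstract describes a Meta-k Network that adaptively weighs candidate values of k per target token. Below is a minimal PyTorch sketch of such a gate, assuming the common formulation in which the network reads the distances of the retrieved neighbors and mixes the resulting kNN distributions, with k = 0 falling back to the plain NMT distribution. The class name `MetaKNetwork`, the hidden size, the candidate-k schedule, and the softmax temperature are illustrative assumptions, not the authors' exact configuration; see the linked repository for the actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetaKNetwork(nn.Module):
    """Illustrative Meta-k style gating network (names and sizes are assumptions).

    For each target token, it predicts a distribution over candidate k values
    from the retrieved neighbors' distances; k = 0 means "ignore retrieval and
    keep the NMT prediction".
    """

    def __init__(self, max_k=8, hidden=32, temperature=10.0):
        super().__init__()
        self.max_k = max_k
        self.temperature = temperature
        # Candidate choices: k in {0, 1, 2, 4, ..., max_k} (powers of two, one possible schedule)
        self.k_choices = [0] + [2 ** i for i in range(int(max_k).bit_length()) if 2 ** i <= max_k]
        self.net = nn.Sequential(
            nn.Linear(max_k, hidden),
            nn.Tanh(),
            nn.Linear(hidden, len(self.k_choices)),
        )

    def forward(self, distances, neighbor_token_ids, nmt_probs):
        """
        distances:          (batch, max_k) L2 distances of retrieved neighbors, sorted ascending
        neighbor_token_ids: (batch, max_k) target-token ids stored with each neighbor
        nmt_probs:          (batch, vocab) output distribution of the NMT model
        """
        batch, vocab = nmt_probs.shape
        # Weights over the candidate k values, predicted from the distance pattern
        k_weights = F.softmax(self.net(distances), dim=-1)                    # (batch, |k_choices|)

        # kNN distribution for each candidate k: softmax over negative distances
        # of the top-k neighbors, scattered onto the vocabulary
        knn_dists = []
        for k in self.k_choices:
            if k == 0:
                knn_dists.append(nmt_probs)                                   # k = 0 -> pure NMT prediction
                continue
            scores = F.softmax(-distances[:, :k] / self.temperature, dim=-1)  # (batch, k)
            dist = torch.zeros(batch, vocab, device=nmt_probs.device)
            dist.scatter_add_(1, neighbor_token_ids[:, :k], scores)
            knn_dists.append(dist)
        knn_dists = torch.stack(knn_dists, dim=1)                             # (batch, |k_choices|, vocab)

        # Final prediction: mixture of the candidate distributions under the Meta-k weights
        return (k_weights.unsqueeze(-1) * knn_dists).sum(dim=1)               # (batch, vocab)
```

Because only this small gate is trained while the NMT model and the datastore stay fixed, a few thousand in-domain examples suffice to fit it, which is consistent with the abstract's claim that the Meta-k Network can be trained with only a few samples and transferred across domains.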