Recent work on non-autoregressive neural machine translation (NAT) aims to improve efficiency through parallel decoding without sacrificing quality. However, existing NAT …
J Gu, X Kong - arXiv preprint arXiv:2012.15833, 2020 - arxiv.org
Fully non-autoregressive neural machine translation (NAT) is proposed to predict all tokens simultaneously in a single forward pass of the network, which significantly reduces the …
This paper presents two strong methods, CTC and Imputer, for non-autoregressive machine translation that model latent alignments with dynamic programming. We revisit CTC for …
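The CTC approach summarized above sums over all latent alignments (target tokens interleaved with blanks) via the standard forward dynamic program. A minimal sketch of that recursion, written from the well-known CTC formulation rather than the paper's own code (function name, data layout, and the toy log-probability format are illustrative assumptions):

```python
import math

def ctc_log_likelihood(log_probs, target, blank=0):
    """Forward-algorithm log P(target | log_probs) under CTC.

    log_probs: list of per-timestep mappings token id -> log prob.
    target: list of token ids (no blanks).
    """
    # Extended target with blanks interleaved: [a, b] -> [-, a, -, b, -]
    ext = [blank]
    for tok in target:
        ext += [tok, blank]
    S = len(ext)
    NEG_INF = float("-inf")

    def logadd(a, b):
        # log(exp(a) + exp(b)) computed stably
        if a == NEG_INF:
            return b
        if b == NEG_INF:
            return a
        m = max(a, b)
        return m + math.log(math.exp(a - m) + math.exp(b - m))

    # alpha[s]: log prob of all partial alignments ending at ext[s]
    alpha = [NEG_INF] * S
    alpha[0] = log_probs[0][ext[0]]
    if S > 1:
        alpha[1] = log_probs[0][ext[1]]
    for t in range(1, len(log_probs)):
        new = [NEG_INF] * S
        for s in range(S):
            a = alpha[s]                          # stay on the same slot
            if s > 0:
                a = logadd(a, alpha[s - 1])       # advance one slot
            # skip a blank when the two surrounding labels differ
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a = logadd(a, alpha[s - 2])
            new[s] = a + log_probs[t][ext[s]]
        alpha = new
    # valid alignments end on the last label or the final blank
    return logadd(alpha[S - 1], alpha[S - 2])
```

For example, with two uniform frames over {blank, a} and target [a], the three collapsing paths (-,a), (a,-), (a,a) each have probability 0.25, so the routine returns log 0.75.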
C Du, Z Tu, J Jiang - International conference on machine …, 2021 - proceedings.mlr.press
We propose a new training objective named order-agnostic cross entropy (OaXE) for fully non-autoregressive translation (NAT) models. OaXE improves the standard cross-entropy …
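The snippet above describes OaXE as a cross-entropy that does not penalize reorderings of the target. A brute-force sketch of that idea, assuming the loss is the cross-entropy under the best assignment of target tokens to output positions (the paper computes this efficiently, e.g. with a matching algorithm; the exhaustive search here is only for illustration):

```python
import itertools
import math

def oaxe_loss(log_probs, target):
    """Order-agnostic cross entropy, brute-forced over permutations.

    log_probs: list (one per output position) of mappings token -> log prob.
    target: list of target tokens, one per position.
    Standard XE scores only the given order; OaXE scores the best order.
    """
    best = float("-inf")
    for perm in itertools.permutations(target):
        # log-likelihood of this ordering of the target tokens
        ll = sum(log_probs[i][tok] for i, tok in enumerate(perm))
        best = max(best, ll)
    return -best
```

If position 0 confidently predicts "B" and position 1 predicts "A", standard XE against the target ["A", "B"] is large, while OaXE credits the permutation ("B", "A") and stays small.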
F Li, J Chen, X Zhang - Electronics, 2023 - mdpi.com
Non-autoregressive neural machine translation (NAMT) has received increasing attention recently by virtue of its promising acceleration paradigm for fast decoding. However, these …
Q Ran, Y Lin, P Li, J Zhou - Proceedings of the AAAI Conference on …, 2021 - ojs.aaai.org
Non-autoregressive neural machine translation (NAT) generates each target word in parallel and has achieved promising inference acceleration. However, existing NAT models still …
J Song, S Kim, S Yoon - arXiv preprint arXiv:2109.06481, 2021 - arxiv.org
Non-autoregressive neural machine translation (NART) models suffer from the multi-modality problem, which causes translation inconsistencies such as token repetition. Most …
Y Zhang, K Sharma, Y Liu - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Recent years have witnessed an increasing use of coordinated accounts on social media, operated by misinformation campaigns to influence public opinion and manipulate social …
Non-autoregressive Transformer (NAT) is a family of text generation models that aims to reduce decoding latency by predicting whole sentences in parallel. However, such …