Most existing methods for robust neural machine translation (NMT) construct adversarial examples by injecting noise into authentic examples and indiscriminately exploit two types …
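As a rough illustration of the noise-injection idea mentioned in this snippet (not the cited paper's actual method), the sketch below perturbs an authentic source sentence with simple character-level edits; the edit types and the noise rate are placeholder assumptions.

```python
import random

def inject_char_noise(sentence: str, noise_prob: float = 0.1) -> str:
    """Create a noisy variant of a source sentence by randomly swapping
    adjacent characters or dropping a character.
    Illustrative perturbation only, not any specific paper's procedure."""
    chars = list(sentence)
    out = []
    i = 0
    while i < len(chars):
        if chars[i] != " " and random.random() < noise_prob:
            if random.random() < 0.5 and i + 1 < len(chars) and chars[i + 1] != " ":
                # swap the current character with the next one (transposition typo)
                out.extend([chars[i + 1], chars[i]])
                i += 2
                continue
            # otherwise drop the character (deletion typo)
            i += 1
            continue
        out.append(chars[i])
        i += 1
    return "".join(out)

if __name__ == "__main__":
    random.seed(0)
    print(inject_char_noise("the committee approved the new translation budget"))
```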
G Sperduti, A Moreo - arXiv preprint arXiv:2501.16836, 2025 - arxiv.org
This survey provides an overview of the challenges of misspellings in natural language processing (NLP). While often unintentional, misspellings have become ubiquitous in digital …
Z Li, M Rei, L Specia - arXiv preprint arXiv:2103.07352, 2021 - arxiv.org
Neural Machine Translation models are sensitive to noise in the input texts, such as misspelled words and ungrammatical constructions. Existing robustness techniques …
There are around 7,000 living languages worldwide; of these, only 50-200 are well-resourced. In many regions of the world, there are languages and …
Neural Machine Translation (NMT) yields remarkable results for high-resource languages when trained on large amounts of data. When there is a lack of linguistic data, the …
SM Jayanthi, A Pratapa - … of the 18th SIGMORPHON Workshop on …, 2021 - aclanthology.org
In this work, we analyze the robustness of neural machine translation systems to grammatical perturbations in the source. In particular, we focus on morphological inflection …
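A toy sketch of the kind of inflectional perturbation this snippet describes, assuming a small hand-written inflection table rather than the morphological resources (e.g., an analyzer or a lexicon such as UniMorph) such studies typically rely on:

```python
import random

# Hypothetical inflection table for illustration only; a real study would
# derive alternative surface forms from a morphological resource.
INFLECTIONS = {
    "walks": ["walk", "walked", "walking"],
    "is": ["are", "was", "were"],
    "dogs": ["dog"],
}

def perturb_inflection(sentence: str) -> str:
    """Replace one inflectable token with a different surface form,
    producing a grammatically perturbed source sentence."""
    tokens = sentence.split()
    candidates = [i for i, t in enumerate(tokens) if t in INFLECTIONS]
    if not candidates:
        return sentence
    i = random.choice(candidates)
    tokens[i] = random.choice(INFLECTIONS[tokens[i]])
    return " ".join(tokens)

if __name__ == "__main__":
    random.seed(1)
    print(perturb_inflection("the dog walks to the market"))
```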