A Araabi, C Monz - arXiv preprint arXiv:2011.02266, 2020 - arxiv.org
Language pairs with limited amounts of parallel data, also known as low-resource languages, remain a challenge for neural machine translation. While the Transformer model …
G Yang, X Chen, Y Zhou, C Yu - 2022 IEEE International …, 2022 - ieeexplore.ieee.org
A shellcode is a small piece of code executed to exploit a software vulnerability, allowing the target computer to execute arbitrary commands from the attacker through a …
A" bigger is better" explosion in the number of parameters in deep neural networks has made it increasingly challenging to make state-of-the-art networks accessible in compute …
A Henry, PR Dachapally, S Pawar, Y Chen - arXiv preprint arXiv …, 2020 - arxiv.org
Low-resource language translation is a challenging but socially valuable NLP task. Building on recent work adapting the Transformer's normalization to this setting, we propose …
Semantic parsing using sequence-to-sequence models allows parsing of deeper representations compared to traditional word tagging based models. In spite of these …
The Transformer model is the state-of-the-art in Machine Translation. However, in general, neural translation models often underperform on language pairs with insufficient training …
In this study, a human evaluation is carried out on how hyperparameter settings impact the quality of Transformer-based Neural Machine Translation (NMT) for the low-resourced …
We study the role of an essential hyper-parameter that governs the training of Transformers for neural machine translation in a low-resource setting: the batch size. Using theoretical …
J Rajab - 3rd Workshop on African Natural Language Processing, 2022 - openreview.net
Research into machine translation for African languages is very limited and low-resourced in terms of datasets and model evaluations. This work aims to add to the field of neural …