Participatory research for low-resourced machine translation: A case study in African languages

W Nekoto, V Marivate, T Matsila, T Fasubaa… - arXiv preprint arXiv …, 2020 - arxiv.org
Research in NLP lacks geographic diversity, and the question of how NLP can be scaled to
low-resourced languages has not yet been adequately solved. "Low-resourced"-ness is a …

Optimizing Transformer for low-resource neural machine translation

A Araabi, C Monz - arXiv preprint arXiv:2011.02266, 2020 - arxiv.org
Language pairs with limited amounts of parallel data, also known as low-resource
languages, remain a challenge for neural machine translation. While the Transformer model …

DualSC: Automatic generation and summarization of shellcode via transformer and dual learning

G Yang, X Chen, Y Zhou, C Yu - 2022 IEEE International …, 2022 - ieeexplore.ieee.org
A shellcode is a small piece of code that is executed to exploit a software vulnerability,
which allows the target computer to execute arbitrary commands from the attacker through a …

The low-resource double bind: An empirical study of pruning for low-resource machine translation

O Ahia, J Kreutzer, S Hooker - arXiv preprint arXiv:2110.03036, 2021 - arxiv.org
A "bigger is better" explosion in the number of parameters in deep neural networks has
made it increasingly challenging to make state-of-the-art networks accessible in compute …
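The pruning studied for this setting is typically unstructured magnitude pruning: remove the smallest-magnitude fraction of weights. A minimal sketch of that idea on a flat weight list (the function name and example values are illustrative, not from the paper):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude `sparsity` fraction of weights
    (unstructured magnitude pruning). Illustrative sketch only."""
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest weight.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

# Pruning half of four weights keeps only the two largest magnitudes.
print(magnitude_prune([0.5, -0.1, 0.3, -0.9], 0.5))  # [0.5, 0.0, 0.0, -0.9]
```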

Query-key normalization for transformers

A Henry, PR Dachapally, S Pawar, Y Chen - arXiv preprint arXiv …, 2020 - arxiv.org
Low-resource language translation is a challenging but socially valuable NLP task. Building
on recent work adapting the Transformer's normalization to this setting, we propose …
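Query-key normalization replaces the usual scaled dot-product logit with the cosine similarity of L2-normalized query and key vectors, scaled by a learnable gain, which bounds the attention logits. A minimal single-pair sketch (function names and the gain default are assumptions for illustration):

```python
import math

def l2_normalize(v):
    """Scale a vector to unit L2 norm (guard against the zero vector)."""
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def qk_norm_score(q, k, g=1.0):
    """Attention logit under query-key normalization: the cosine
    similarity of q and k times a learnable gain g, in place of
    the usual q.k / sqrt(d). Logits are bounded in [-g, g]."""
    qn, kn = l2_normalize(q), l2_normalize(k)
    return g * sum(a * b for a, b in zip(qn, kn))

# Identical directions give the maximum logit g, here 1.0.
print(qk_norm_score([3.0, 4.0], [3.0, 4.0]))  # 1.0
```

Bounding the logits keeps the softmax from saturating, which is one reason this style of normalization is attractive when training data is scarce.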

Non-autoregressive semantic parsing for compositional task-oriented dialog

A Babu, A Shrivastava, A Aghajanyan, A Aly… - arXiv preprint arXiv …, 2021 - arxiv.org
Semantic parsing using sequence-to-sequence models allows parsing of deeper
representations compared to traditional word tagging based models. In spite of these …

Transformers for Low-Resource Languages: Is Féidir Linn!

S Lankford, H Afli, A Way - arXiv preprint arXiv:2403.01985, 2024 - arxiv.org
The Transformer model is the state-of-the-art in Machine Translation. However, in general,
neural translation models often underperform on language pairs with insufficient training …

Human evaluation of English–Irish transformer-based NMT

S Lankford, H Afli, A Way - Information, 2022 - mdpi.com
In this study, a human evaluation is carried out on how hyperparameter settings impact the
quality of Transformer-based Neural Machine Translation (NMT) for the low-resourced …

Small batch sizes improve training of low-resource neural MT

ÀR Atrio, A Popescu-Belis - arXiv preprint arXiv:2203.10579, 2022 - arxiv.org
We study the role of an essential hyper-parameter that governs the training of Transformers
for neural machine translation in a low-resource setting: the batch size. Using theoretical …
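One concrete effect of batch size is the number of optimizer updates a fixed corpus yields: smaller batches mean more frequent (and noisier) updates. A trivial sketch with illustrative numbers (the function and corpus size are assumptions, not from the paper):

```python
import math

def updates_per_epoch(num_examples, batch_size):
    """Optimizer updates per epoch for a given batch size: smaller
    batches yield more, noisier updates over the same corpus."""
    return math.ceil(num_examples / batch_size)

# Same 10k-sentence corpus, two batch sizes (values illustrative):
print(updates_per_epoch(10_000, 4096))  # 3
print(updates_per_epoch(10_000, 256))   # 40
```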

Effect of tokenisation strategies for low-resourced Southern African languages

J Rajab - 3rd Workshop on African Natural Language Processing, 2022 - openreview.net
Research into machine translation for African languages remains limited and low-resourced in
terms of datasets and model evaluations. This work aims to add to the field of neural …
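Tokenisation strategy matters especially for morphologically rich Southern African languages, where word-level tokenisation explodes the vocabulary while character-level tokenisation inflates sequence length. A tiny sketch of that trade-off (the isiZulu example word and its gloss are my assumption, not from the paper):

```python
def word_tokenise(text):
    """Whitespace tokenisation: each surface form is one token, so an
    agglutinative language gets a huge, sparse vocabulary."""
    return text.split()

def char_tokenise(text):
    """Character tokenisation: tiny vocabulary, much longer sequences."""
    return [c for c in text if c != " "]

# Assumed example: one agglutinative isiZulu word carrying a whole
# English clause ("I love you").
word = "ngiyakuthanda"
print(len(word_tokenise(word)), len(char_tokenise(word)))  # 1 13
```

Subword schemes such as BPE sit between these extremes, which is why comparing tokenisation strategies is a natural axis for low-resourced MT evaluation.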