Z Mao, C Chu, S Kurohashi - IEEE/ACM Transactions on Audio …, 2024 - ieeexplore.ieee.org
Massively multilingual sentence representation models, e.g., LASER, SBERT-distill, and LaBSE, significantly improve cross-lingual downstream tasks. However, the use of a …
K Smaïli, D Langlois, P Pribil - Fifth International Conference on …, 2022 - hal.science
More than 13 million people suffer a stroke each year. Aphasia is a language disorder, usually caused by a stroke that damages a specific area of the brain that controls …
R Dabre, A Fujita - arXiv preprint arXiv:2009.09372, 2020 - arxiv.org
Neural machine translation (NMT) models are typically trained using a softmax cross-entropy loss where the softmax distribution is compared against smoothed gold labels. In …
R Dabre, A Fujita - … of Machine Translation Summit XVIII: Research …, 2021 - aclanthology.org
Neural machine translation (NMT) models are typically trained using a softmax cross-entropy loss where the softmax distribution is compared against the gold labels. In low …
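The two entries above refer to the standard label-smoothed cross-entropy objective for NMT: the one-hot gold label is softened so the correct token receives probability 1 − ε and the remaining ε mass is spread over the other classes. A minimal sketch of that loss (names and the ε value are illustrative, not taken from the papers):

```python
import math

def label_smoothed_nll(probs, gold_index, num_classes, epsilon=0.1):
    """Cross-entropy against a smoothed gold distribution: the gold
    label gets probability 1 - epsilon, and epsilon is spread
    uniformly over the remaining classes."""
    loss = 0.0
    for k in range(num_classes):
        target = (1 - epsilon) if k == gold_index else epsilon / (num_classes - 1)
        loss -= target * math.log(probs[k])
    return loss

# With a uniform prediction over 4 classes the loss is -log(0.25),
# regardless of epsilon, since the targets sum to one.
uniform = [0.25] * 4
print(round(label_smoothed_nll(uniform, 0, 4), 4))  # 1.3863
```

Setting `epsilon=0` recovers the plain cross-entropy against the one-hot gold label that the second entry describes.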
R Dabre, A Fujita - Proceedings of the Fifth Conference on …, 2020 - aclanthology.org
In neural machine translation (NMT), sequence distillation (SD) through creation of distilled corpora leads to efficient (compact and fast) models. However, its effectiveness in extremely …
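Sequence distillation, as mentioned in the entry above, replaces the gold target side of a parallel corpus with a trained teacher model's translations; a compact student is then trained on this distilled corpus. A toy sketch of the corpus-creation step (the dictionary teacher is a hypothetical stand-in for a real NMT model):

```python
def distill_corpus(sources, teacher_translate):
    """Sequence-level distillation: pair each source sentence with the
    teacher's output, replacing the gold targets. A student model is
    then trained on the resulting (source, teacher output) pairs."""
    return [(src, teacher_translate(src)) for src in sources]

# Hypothetical toy teacher standing in for a trained NMT model.
toy_teacher = {"hallo welt": "hello world", "guten tag": "good day"}.get

corpus = distill_corpus(["hallo welt", "guten tag"], toy_teacher)
print(corpus)  # [('hallo welt', 'hello world'), ('guten tag', 'good day')]
```

The student never sees the original gold targets, which is what makes the distilled corpus "easier" to fit with a smaller, faster model.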
In a diverse linguistic landscape where over 7,100 languages are spoken, vast swathes of digital content remain isolated within language silos, creating significant barriers to global …
Neural machine translation (NMT) [1] is known to give state-of-the-art translations for a variety of language pairs. Sub-word segmentation [2, 3] is one of the key reasons behind it …
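The sub-word segmentation named in the last entry is typically learned with byte-pair encoding (BPE): starting from characters, the most frequent adjacent symbol pair is merged repeatedly to build a sub-word vocabulary. A minimal sketch of the merge-learning loop (a simplification of the full algorithm, with a toy corpus):

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn BPE merge rules: repeatedly merge the most frequent
    adjacent symbol pair across the corpus, starting from characters."""
    vocab = {tuple(w): c for w, c in Counter(words).items()}
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, count in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        new_vocab = {}
        for word, count in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] = count
        vocab = new_vocab
    return merges

print(bpe_merges(["low", "low", "lower", "lowest"], 2))
```

Each learned merge turns a frequent character pair into a single sub-word symbol, which lets NMT models represent rare and unseen words as sequences of known sub-words.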