Neural machine translation: A review

F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …

Survey of low-resource machine translation

B Haddow, R Bawden, AVM Barone, J Helcl… - Computational …, 2022 - direct.mit.edu
We present a survey covering the state of the art in low-resource machine translation (MT)
research. There are currently around 7,000 languages spoken in the world and almost all …

When and why are pre-trained word embeddings useful for neural machine translation?

Y Qi, DS Sachan, M Felix, SJ Padmanabhan… - arXiv preprint arXiv …, 2018 - arxiv.org
The performance of Neural Machine Translation (NMT) systems often suffers in low-resource
scenarios where sufficiently large-scale parallel corpora cannot be obtained. Pre-trained …

Revisiting low-resource neural machine translation: A case study

R Sennrich, B Zhang - arXiv preprint arXiv:1905.11901, 2019 - arxiv.org
It has been shown that the performance of neural machine translation (NMT) drops starkly in
low-resource conditions, underperforming phrase-based statistical machine translation …

Overview of the 8th workshop on Asian translation

T Nakazawa, H Nakayama, C Ding… - Proceedings of the …, 2021 - aclanthology.org
This paper presents the results of the shared tasks from the 8th workshop on Asian
translation (WAT2021). For the WAT2021, 28 teams participated in the shared tasks and 24 …

Privformer: Privacy-preserving transformer with MPC

Y Akimoto, K Fukuchi, Y Akimoto… - 2023 IEEE 8th …, 2023 - ieeexplore.ieee.org
The Transformer is a deep learning architecture that processes sequence data. The
Transformer attains the state-of-the-art in several tasks of sequence data analysis, and its …

Universal multimodal representation for language understanding

Z Zhang, K Chen, R Wang, M Utiyama… - … on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Representation learning is the foundation of natural language processing (NLP). This work
presents new methods to employ visual information as assistant signals to general NLP …

LoGenText: Automatically generating logging texts using neural machine translation

Z Ding, H Li, W Shang - 2022 IEEE International Conference on …, 2022 - ieeexplore.ieee.org
The textual descriptions in logging statements (i.e., logging texts) are printed during system
executions and exposed to multiple stakeholders including developers, operators, users …

LoGenText-Plus: Improving Neural Machine Translation Based Logging Texts Generation with Syntactic Templates

Z Ding, Y Tang, X Cheng, H Li, W Shang - ACM Transactions on …, 2023 - dl.acm.org
Developers insert logging statements in the source code to collect important runtime
information about software systems. The textual descriptions in logging statements (i.e. …

Subword-level word vector representations for Korean

S Park, J Byun, S Baek, Y Cho, A Oh - Proceedings of the 56th …, 2018 - aclanthology.org
Research on distributed word representations is focused on widely-used languages such as
English. Although the same methods can be used for other languages, language-specific …