T Li, K Murray - arXiv preprint arXiv:2305.17325, 2023 - arxiv.org
In zero-shot cross-lingual transfer, a multilingual model is trained to perform a task in one language and then applied to another language. Although the zero-shot cross-lingual …
Multilingual language models have pushed the state-of-the-art in cross-lingual NLP transfer. The majority of zero-shot cross-lingual transfer work, however, uses one and the same massively …
Recent model pruning methods have demonstrated the ability to remove redundant parameters without sacrificing model performance. Common methods remove redundant …
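The pruning idea in the snippet above can be illustrated with magnitude pruning, one common baseline for removing redundant parameters. This is a minimal NumPy sketch under my own assumptions, not the specific method of the cited work: weights whose absolute value falls in the smallest fraction are zeroed out.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction of weights.

    sparsity=0.5 removes (sets to zero) the 50% of entries with the
    smallest absolute values; the rest are kept unchanged.
    """
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.9, -0.05, 0.3],
              [-0.01, 0.7, 0.02]])
pruned = magnitude_prune(w, sparsity=0.5)
# The three smallest-magnitude entries (-0.05, -0.01, 0.02) are zeroed.
```

In practice, libraries such as PyTorch expose equivalent functionality (e.g. `torch.nn.utils.prune`), and more sophisticated criteria than raw magnitude are often used.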
The rapid development and real-world adoption of natural language processing models in recent years underscore the imperative to develop such models in different languages. This …
Incorporating language-specific (LS) modules is a proven method to boost performance in multilingual machine translation. This approach bears similarity to Mixture-of-Experts (MoE) …
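The analogy drawn in the snippet above, between language-specific (LS) modules and Mixture-of-Experts routing, can be sketched in a few lines. This toy example assumes one expert weight matrix per language and hard routing by a language tag; the module names and shapes are illustrative, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model = 4

# One language-specific "expert" (here a single weight matrix) per language,
# analogous to an MoE layer with hard, deterministic routing.
ls_modules = {
    lang: rng.standard_normal((d_model, d_model))
    for lang in ("en", "de", "uk")
}

def ls_forward(x: np.ndarray, lang: str) -> np.ndarray:
    """Route the input through the expert selected by the language tag."""
    return x @ ls_modules[lang]

x = rng.standard_normal(d_model)
out_de = ls_forward(x, "de")
out_uk = ls_forward(x, "uk")
# Different languages activate different parameters for the same input.
```

Real MoE layers typically route per token via a learned gating network rather than a fixed language tag; the point here is only the shared structure of conditional, per-group parameters.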
P Kuchmiichuk - Proceedings of the Second Ukrainian Natural …, 2023 - aclanthology.org
Low-resource languages continue to present challenges for current NLP methods, and multilingual NLP is gaining attention in the research community. One of the main issues is …
Advancements in machine learning have revolutionized natural language processing, enabling the development of models that can understand and generate multiple languages …