The prompt-based learning paradigm, which bridges the gap between pre-training and fine-tuning, achieves state-of-the-art performance on several NLP tasks, particularly in few-shot …
Cross-lingual summarization is the task of generating a summary in one language (e.g., English) for the given document(s) in a different language (e.g., Chinese). Under the …
Given a document in a source language, cross-lingual summarization (CLS) aims to generate a summary in a different target language. Recently, the emergence of Large …
The goal of cross-lingual summarization (CLS) is to convert a document in one language (e.g., English) into a summary in another (e.g., Chinese). Essentially, the CLS task is the …
Large Language Models (LLMs), which bridge the gap between human language understanding and complex problem-solving, achieve state-of-the-art performance on …
To adapt text summarization to the multilingual world, previous work proposes multi-lingual summarization (MLS) and cross-lingual summarization (CLS). However, these two tasks …
S Zhao, M Jia, LA Tuan, F Pan… - arXiv preprint arXiv …, 2024 - researchgate.net
In-context learning, a paradigm bridging the gap between pre-training and fine-tuning, has demonstrated high efficacy in several NLP tasks, especially in few-shot settings. Despite …
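The snippet above describes in-context learning, where a few labeled demonstrations are placed in the prompt before the test input and no parameters are updated. A minimal sketch of how such a prompt is assembled, assuming a generic text-classification task (the demonstrations and the `build_icl_prompt` helper are illustrative, not from the cited work):

```python
# Minimal sketch of few-shot in-context learning: the "prompt" is simply
# a handful of (input, label) demonstrations concatenated before the
# unlabeled query, which the language model then completes.

def build_icl_prompt(demonstrations, query):
    """Join demonstrations, then append the query with an empty label slot."""
    parts = [f"Input: {x}\nLabel: {y}" for x, y in demonstrations]
    parts.append(f"Input: {query}\nLabel:")
    return "\n\n".join(parts)

# Hypothetical sentiment demonstrations for illustration.
demos = [
    ("the movie was wonderful", "positive"),
    ("a dull, lifeless plot", "negative"),
]
prompt = build_icl_prompt(demos, "an unexpected delight")
print(prompt)
```

The model's continuation after the final `Label:` is read off as the prediction; in the few-shot setting the demonstrations are all the task supervision the model sees.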
Z Huang, P Yu, J Allan - Proceedings of the Sixteenth ACM International …, 2023 - dl.acm.org
Benefiting from transformer-based pre-trained language models, neural ranking models have made significant progress. More recently, the advent of multilingual pre-trained …
Given a document in a source language, cross-lingual summarization (CLS) aims at generating a concise summary in a different target language. Unlike monolingual …
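Several of the snippets above define the CLS task itself: source-language document in, target-language summary out. Under an LLM-based approach this reduces to an instruction prompt; a minimal sketch, where the prompt template is an assumption and the model call itself is omitted:

```python
# Minimal sketch of zero-shot cross-lingual summarization via prompting:
# the instruction names a target language different from the document's
# source language. The actual LLM call is deliberately left out; only the
# prompt construction is shown.

def build_cls_prompt(document, src_lang, tgt_lang):
    """Instruction prompt asking for a tgt_lang summary of a src_lang document."""
    return (
        f"Summarize the following {src_lang} document "
        f"in one {tgt_lang} sentence.\n\n"
        f"Document:\n{document}\n\nSummary:"
    )

prompt = build_cls_prompt("今天上海下了大雨，多条道路积水。", "Chinese", "English")
print(prompt)
```

Compared with the traditional translate-then-summarize or summarize-then-translate pipelines, a single prompted model performs both the summarization and the language transfer in one step.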