Text summarization with pretrained encoders

Y Liu, M Lapata - arXiv preprint arXiv:1908.08345, 2019 - arxiv.org
Bidirectional Encoder Representations from Transformers (BERT) represents the latest
incarnation of pretrained language models which have recently advanced a wide range of …
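
The extractive variant explored in this line of work can be sketched roughly as follows: a [CLS]-style token is prepended to every sentence, the document is run through a pretrained encoder, and each sentence's [CLS] vector is scored for inclusion in the summary. The tiny nn.TransformerEncoder below is only a stand-in for BERT, so treat this as an illustrative sketch rather than the paper's implementation.

```python
# Hedged sketch: extractive summarization with a pretrained encoder, in the spirit of
# scoring one [CLS]-like token per sentence. The small Transformer stands in for BERT.
import torch
import torch.nn as nn

class ExtractiveScorer(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_layers=2, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.classifier = nn.Linear(d_model, 1)  # one relevance logit per sentence

    def forward(self, token_ids, cls_positions):
        # token_ids: (batch, seq_len); cls_positions: (batch, n_sents)
        hidden = self.encoder(self.embed(token_ids))
        # gather the hidden state at each sentence's [CLS] position
        idx = cls_positions.unsqueeze(-1).expand(-1, -1, hidden.size(-1))
        sent_vecs = hidden.gather(1, idx)
        return self.classifier(sent_vecs).squeeze(-1)  # (batch, n_sents) logits

# toy usage: two "sentences" whose [CLS] tokens sit at positions 0 and 5
scorer = ExtractiveScorer()
tokens = torch.randint(0, 30522, (1, 10))
logits = scorer(tokens, torch.tensor([[0, 5]]))
print(logits.shape)  # torch.Size([1, 2])
```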

QuestEval: Summarization asks for fact-based evaluation

T Scialom, PA Dray, P Gallinari, S Lamprier… - arXiv preprint arXiv …, 2021 - arxiv.org
Summarization evaluation remains an open research problem: current metrics such as
ROUGE are known to be limited and to correlate poorly with human judgments. To alleviate …
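
As a reference point for the criticism of ROUGE above, here is a minimal ROUGE-1 F1 computation (the official toolkit adds stemming and other options not reproduced here). Because the score is plain unigram overlap, a candidate that contradicts the reference can still score highly, which is the gap QA-based metrics like QuestEval aim to close.

```python
# Minimal ROUGE-1 F1: unigram overlap between candidate and reference summaries.
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    cand, ref = Counter(candidate.lower().split()), Counter(reference.lower().split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# A factually wrong candidate can still score well on unigram overlap:
print(rouge1_f1("the deal was approved in 2020",
                "the deal was rejected in 2020"))  # ~0.83
```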

Pretraining-based natural language generation for text summarization

H Zhang, J Xu, J Wang - arXiv preprint arXiv:1902.09243, 2019 - arxiv.org
In this paper, we propose a novel pretraining-based encoder-decoder framework, which can
generate the output sequence based on the input sequence in a two-stage manner. For the …
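
A hedged sketch of a generic two-stage, generate-then-refine control flow of the kind the snippet describes: a first decoder drafts a summary from the source, and a second stage revisits the draft position by position. The stand-in callables and masking scheme are illustrative assumptions, not the paper's architecture.

```python
# Illustrative two-stage generation loop: draft first, then refine each position.
def two_stage_generate(source_tokens, draft_decoder, refiner):
    draft = draft_decoder(source_tokens)                  # stage 1: draft summary
    refined = list(draft)
    for i in range(len(refined)):                         # stage 2: refine each position
        masked = refined[:i] + ["[MASK]"] + refined[i + 1:]
        refined[i] = refiner(source_tokens, masked, position=i)
    return refined

# toy usage with trivial stand-in models: the "refiner" just copies a source token
draft_decoder = lambda src: src[:3]
refiner = lambda src, masked, position: src[position]
print(two_stage_generate(["the", "cat", "sat", "on", "the", "mat"],
                         draft_decoder, refiner))
```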

Extractive summarization of long documents by combining global and local context

W Xiao, G Carenini - arXiv preprint arXiv:1909.08089, 2019 - arxiv.org
In this paper, we propose a novel neural single document extractive summarization model
for long documents, incorporating both the global context of the whole document and the …
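
The sketch below illustrates the general idea of combining a global (whole-document) and a local (surrounding-section) representation when scoring each sentence of a long document. The mean-pooled context vectors are placeholder assumptions; the paper itself uses recurrent encoders, so this is only an illustration of the combination.

```python
# Hedged sketch: score each sentence from its own vector plus local and global context.
import torch
import torch.nn as nn

class GlobalLocalScorer(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(3 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, sent_vecs, section_ids):
        # sent_vecs: (n_sents, dim) one vector per sentence
        # section_ids: (n_sents,) index of the section each sentence belongs to
        doc_vec = sent_vecs.mean(dim=0, keepdim=True).expand_as(sent_vecs)  # global context
        local_vecs = torch.stack([
            sent_vecs[section_ids == section_ids[i]].mean(dim=0)            # local context
            for i in range(sent_vecs.size(0))
        ])
        return self.score(torch.cat([sent_vecs, local_vecs, doc_vec], dim=-1)).squeeze(-1)

scorer = GlobalLocalScorer()
scores = scorer(torch.randn(6, 128), torch.tensor([0, 0, 0, 1, 1, 1]))
print(scores.shape)  # torch.Size([6])
```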

Leveraging graph to improve abstractive multi-document summarization

W Li, X Xiao, J Liu, H Wu, H Wang, J Du - arXiv preprint arXiv:2005.10043, 2020 - arxiv.org
Graphs that capture relations between textual units have great benefits for detecting salient
information from multiple documents and generating overall coherent summaries. In this …
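
One common way to realize a graph over textual units is a similarity graph over paragraphs drawn from several documents; the TF-IDF cosine construction below is a hedged illustration of such a graph. How the graph is then injected into the abstractive model is specific to the paper and not reproduced here.

```python
# Hedged sketch: a TF-IDF cosine-similarity graph over paragraphs from multiple documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

paragraphs = [
    "The company announced record profits for the quarter.",
    "Quarterly profits reached a record high, the firm said.",
    "Heavy rain is expected across the region this weekend.",
]
tfidf = TfidfVectorizer().fit_transform(paragraphs)
adjacency = cosine_similarity(tfidf)      # dense (n, n) similarity matrix
adjacency[adjacency < 0.1] = 0.0          # drop weak edges to sparsify the graph
print(adjacency.round(2))
```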

Conditional generation with a question-answering blueprint

S Narayan, J Maynez, RK Amplayo… - Transactions of the …, 2023 - direct.mit.edu
The ability to convey relevant and faithful information is critical for many tasks in conditional
generation and yet remains elusive for neural seq-to-seq models whose outputs often reveal …
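
A minimal sketch of the blueprint idea: a plan of question-answer pairs is generated before the summary itself, shown here only as a target-string format that a single seq-to-seq model could be trained to produce. The [Q]/[A]/[SUMMARY] markers are illustrative assumptions, not the paper's specification.

```python
# Illustrative target format: a question-answer plan followed by the summary text.
def blueprint_target(qa_pairs, summary):
    plan = " ".join(f"[Q] {q} [A] {a}" for q, a in qa_pairs)
    return f"{plan} [SUMMARY] {summary}"

print(blueprint_target(
    [("Who won the match?", "Ajax"), ("What was the score?", "2-0")],
    "Ajax beat their rivals 2-0 on Sunday.",
))
```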

Single-Document Abstractive Text Summarization: A Systematic Literature Review

A Rao, S Aithal, S Singh - ACM Computing Surveys, 2024 - dl.acm.org
Abstractive text summarization is a task in natural language processing that automatically
generates a summary of the source document in a human-like written form with minimal loss …

Sample efficient text summarization using a single pre-trained transformer

U Khandelwal, K Clark, D Jurafsky, L Kaiser - arXiv preprint arXiv …, 2019 - arxiv.org
Language model (LM) pre-training has resulted in impressive performance and sample
efficiency on a variety of language understanding tasks. However, it remains unclear how to …
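
A hedged sketch of the decoder-only setup this entry studies: source and summary are concatenated into one sequence, a separator marks the boundary, and a language model is trained on the result with the loss restricted to summary tokens. The "TL;DR:" separator and the -100 ignore-label are illustrative conventions, not the paper's exact recipe.

```python
# Hedged sketch: build a single LM training example from (article, summary).
SEP = " TL;DR: "

def build_lm_example(article: str, summary: str, tokenize):
    src_ids = tokenize(article + SEP)
    tgt_ids = tokenize(summary)
    input_ids = src_ids + tgt_ids
    # -100 is a common "ignore" label; only summary positions contribute to the loss
    labels = [-100] * len(src_ids) + tgt_ids
    return input_ids, labels

toy_tokenize = lambda text: [ord(c) for c in text]   # stand-in for a real tokenizer
ids, labels = build_lm_example("A long news article ...", "Short summary.", toy_tokenize)
print(len(ids), labels[:5])
```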

BASS: Boosting abstractive summarization with unified semantic graph

W Wu, W Li, X Xiao, J Liu, Z Cao, S Li, H Wu… - arXiv preprint arXiv …, 2021 - arxiv.org
Abstractive summarization of long documents or multiple documents remains challenging for the
Seq2Seq architecture, as Seq2Seq is not good at analyzing long-distance relations in text …

Generating representative headlines for news stories

X Gu, Y Mao, J Han, J Liu, Y Wu, C Yu… - Proceedings of The …, 2020 - dl.acm.org
Millions of news articles are published online every day, which can be overwhelming for
readers to follow. Grouping articles that are reporting the same event into news stories is a …