Recent advances in document summarization

J Yao, X Wan, J Xiao - Knowledge and Information Systems, 2017 - Springer
The task of automatic document summarization aims at generating short summaries for
originally long documents. A good summary should cover the most important information of …

Convolutional sequence to sequence learning

J Gehring, M Auli, D Grangier… - International …, 2017 - proceedings.mlr.press
The prevalent approach to sequence to sequence learning maps an input sequence to a
variable length output sequence via recurrent neural networks. We introduce an architecture …

Faithful to the original: Fact aware neural abstractive summarization

Z Cao, F Wei, W Li, S Li - Proceedings of the AAAI Conference on …, 2018 - ojs.aaai.org
Unlike extractive summarization, abstractive summarization has to fuse different parts of the
source text, which inclines to create fake facts. Our preliminary study reveals nearly 30% of …

Multi-temporal land cover classification with sequential recurrent encoders

M Rußwurm, M Körner - ISPRS International Journal of Geo-Information, 2018 - mdpi.com
Earth observation (EO) sensors deliver data at daily or weekly intervals. Most land use and
land cover classification (LULC) approaches, however, are designed for cloud-free and …

Minimum risk training for neural machine translation

S Shen, Y Cheng, Z He, W He, H Wu, M Sun… - arXiv preprint arXiv …, 2015 - arxiv.org
We propose minimum risk training for end-to-end neural machine translation. Unlike
conventional maximum likelihood estimation, minimum risk training is capable of optimizing …

Selective encoding for abstractive sentence summarization

Q Zhou, N Yang, F Wei, M Zhou - arXiv preprint arXiv:1704.07073, 2017 - arxiv.org
We propose a selective encoding model to extend the sequence-to-sequence framework for
abstractive sentence summarization. It consists of a sentence encoder, a selective gate …

Retrieve, rerank and rewrite: Soft template based neural summarization

Z Cao, W Li, F Wei, S Li - 2018 - ira.lib.polyu.edu.hk
Most previous seq2seq summarization systems purely depend on the source text to
generate summaries, which tends to work unstably. Inspired by the traditional template …

Towards explainable NLP: A generative explanation framework for text classification

H Liu, Q Yin, WY Wang - arXiv preprint arXiv:1811.00196, 2018 - arxiv.org
Building explainable systems is a critical problem in the field of Natural Language
Processing (NLP), since most machine learning models provide no explanations for the …

Controlling output length in neural encoder-decoders

Y Kikuchi, G Neubig, R Sasano, H Takamura… - arXiv preprint arXiv …, 2016 - arxiv.org
Neural encoder-decoder models have shown great success in many sequence generation
tasks. However, previous work has not investigated situations in which we would like to …

Learning to extract coherent summary via deep reinforcement learning

Y Wu, B Hu - Proceedings of the AAAI Conference on Artificial …, 2018 - ojs.aaai.org
Coherence plays a critical role in producing a high-quality summary from a document. In
recent years, neural extractive summarization is becoming increasingly attractive. However …