Authors
Jiawen Jiang, Haiyang Zhang, Chenxu Dai, Qingjuan Zhao, Hao Feng, Zhanlin Ji, Ivan Ganchev
Publication date
2021/9/3
Journal
IEEE Access
Volume
9
Pages
123660-123671
Publisher
IEEE
Description
The automatic generation of a text summary is the task of producing a short summary of a relatively long text document by capturing its key information. In the past, supervised statistical machine learning was widely used for the Automatic Text Summarization (ATS) task, but because of its high dependence on the quality of text features, the generated summaries lacked accuracy and coherence, and the computational power required and the performance achieved could not easily meet current needs. This paper proposes four novel ATS models with a Sequence-to-Sequence (Seq2Seq) structure, utilizing an attention-based bidirectional Long Short-Term Memory (LSTM), with added enhancements for increasing the correlation between the generated text summary and the source text, solving the problem of out-of-vocabulary (OOV) words, suppressing repeated words, and preventing the spread of cumulative …
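The abstract names an attention-based bidirectional LSTM arranged as a Seq2Seq model as the common backbone of the four proposed models. Below is a minimal PyTorch sketch of that base architecture only, assuming illustrative class and parameter names (AttnBiLSTMSeq2Seq, vocab_size, emb_dim, hid_dim); it is not the authors' implementation and omits the paper's enhancements for OOV handling, repetition suppression, and cumulative-error control.

```python
# Minimal sketch (not the authors' code) of a Seq2Seq summarizer with a
# bidirectional LSTM encoder and an attention-based LSTM decoder.
# All layer sizes and names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttnBiLSTMSeq2Seq(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM encoder reads the source document.
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                               bidirectional=True)
        # Unidirectional LSTM decoder generates the summary token by token.
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.reduce_h = nn.Linear(2 * hid_dim, hid_dim)  # merge fwd/bwd states
        self.reduce_c = nn.Linear(2 * hid_dim, hid_dim)
        self.attn = nn.Linear(2 * hid_dim + hid_dim, 1)  # additive-style score
        self.out = nn.Linear(hid_dim + 2 * hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode: enc_out has shape (B, S, 2*hid_dim).
        enc_out, (h, c) = self.encoder(self.embed(src_ids))
        h0 = torch.tanh(self.reduce_h(torch.cat([h[0], h[1]], -1))).unsqueeze(0)
        c0 = torch.tanh(self.reduce_c(torch.cat([c[0], c[1]], -1))).unsqueeze(0)

        logits = []
        state = (h0, c0)
        for t in range(tgt_ids.size(1)):  # teacher forcing on the gold summary
            dec_in = self.embed(tgt_ids[:, t:t + 1])
            dec_out, state = self.decoder(dec_in, state)       # (B, 1, hid_dim)
            # Attention: score every source position against the decoder state.
            query = dec_out.expand(-1, enc_out.size(1), -1)    # (B, S, hid_dim)
            scores = self.attn(torch.cat([enc_out, query], -1)).squeeze(-1)
            weights = F.softmax(scores, dim=-1).unsqueeze(1)   # (B, 1, S)
            context = torch.bmm(weights, enc_out)              # (B, 1, 2*hid_dim)
            logits.append(self.out(torch.cat([dec_out, context], -1)))
        return torch.cat(logits, dim=1)  # (B, T, vocab_size)


# Toy usage: random token ids stand in for a tokenized article/summary pair.
model = AttnBiLSTMSeq2Seq()
src = torch.randint(0, 10000, (2, 40))   # batch of 2 "articles", 40 tokens each
tgt = torch.randint(0, 10000, (2, 12))   # batch of 2 "summaries", 12 tokens each
print(model(src, tgt).shape)             # torch.Size([2, 12, 10000])
```

The paper's enhancements would sit on top of a decoder like this one (e.g., by modifying the output distribution or the attention weights); how they are realized is not recoverable from the truncated abstract above.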