Survey of hallucination in natural language generation

Z Ji, N Lee, R Frieske, T Yu, D Su, Y Xu, E Ishii… - ACM Computing …, 2023 - dl.acm.org
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …

A review on question generation from natural language text

R Zhang, J Guo, L Chen, Y Fan, X Cheng - ACM Transactions on …, 2021 - dl.acm.org
Question generation is an important yet challenging problem in Artificial Intelligence (AI),
which aims to generate natural and relevant questions from various input formats, e.g., …

Event extraction as machine reading comprehension

J Liu, Y Chen, K Liu, W Bi, X Liu - Proceedings of the 2020 …, 2020 - aclanthology.org
Event extraction (EE) is a crucial information extraction task that aims to extract event
information in texts. Previous methods for EE typically model it as a classification task, which …

Mass: Masked sequence to sequence pre-training for language generation

K Song, X Tan, T Qin, J Lu, TY Liu - arXiv preprint arXiv:1905.02450, 2019 - arxiv.org
Pre-training and fine-tuning, e.g., BERT, have achieved great success in language
understanding by transferring knowledge from a rich-resource pre-training task to the low/zero …

Paragraph-level neural question generation with maxout pointer and gated self-attention networks

Y Zhao, X Ni, Y Ding, Q Ke - … of the 2018 conference on empirical …, 2018 - aclanthology.org
Question generation, the task of automatically creating questions that can be answered by a
certain span of text within a given passage, is important for question-answering and …

Deep reinforcement learning for sequence-to-sequence models

Y Keneshloo, T Shi, N Ramakrishnan… - IEEE Transactions on …, 2019 - ieeexplore.ieee.org
In recent times, sequence-to-sequence (seq2seq) models have gained a lot of popularity
and provide state-of-the-art performance in a wide variety of tasks, such as machine …

Reinforcement learning based graph-to-sequence model for natural question generation

Y Chen, L Wu, MJ Zaki - arXiv preprint arXiv:1908.04942, 2019 - arxiv.org
Natural question generation (QG) aims to generate questions from a passage and an
answer. Previous works on QG either (i) ignore the rich structure information hidden in …

A recurrent BERT-based model for question generation

YH Chan, YC Fan - Proceedings of the 2nd workshop on machine …, 2019 - aclanthology.org
In this study, we investigate the employment of the pre-trained BERT language model to
tackle question generation tasks. We introduce three neural architectures built on top of …

Unsupervised question answering by cloze translation

P Lewis, L Denoyer, S Riedel - arXiv preprint arXiv:1906.04980, 2019 - arxiv.org
Obtaining training data for Question Answering (QA) is time-consuming and resource-
intensive, and existing QA datasets are only available for limited domains and languages. In …

Addressing semantic drift in question generation for semi-supervised question answering

S Zhang, M Bansal - arXiv preprint arXiv:1909.06356, 2019 - arxiv.org
Text-based Question Generation (QG) aims at generating natural and relevant questions
that can be answered by a given answer in some context. Existing QG models suffer from a "…