Paraphrase generation: A survey of the state of the art

J Zhou, S Bhat - Proceedings of the 2021 conference on empirical …, 2021 - aclanthology.org
This paper focuses on paraphrase generation, which is a widely studied natural language
generation task in NLP. With the development of neural models, paraphrase generation …

Improving the robustness of question answering systems to question paraphrasing

WC Gan, HT Ng - Proceedings of the 57th annual meeting of the …, 2019 - aclanthology.org
Despite the advancement of question answering (QA) systems and rapid improvements on
held-out test sets, their generalizability is a topic of concern. We explore the robustness of …

Neural syntactic preordering for controlled paraphrase generation

T Goyal, G Durrett - arXiv preprint arXiv:2005.02013, 2020 - arxiv.org
Paraphrasing natural language sentences is a multifaceted process: it might involve
replacing individual words or short phrases, local rearrangement of content, or high-level …

One size does not fit all: Generating and evaluating variable number of keyphrases

X Yuan, T Wang, R Meng, K Thaker… - arXiv preprint arXiv …, 2018 - arxiv.org
Different texts naturally correspond to different numbers of keyphrases. This
desideratum is largely missing from existing neural keyphrase generation models. In this …

Exploring metaphoric paraphrase generation

K Stowe, N Beck, I Gurevych - Proceedings of the 25th conference …, 2021 - aclanthology.org
Metaphor generation is a difficult task, and has seen tremendous improvement with the
advent of deep pretrained models. We focus here on the specific task of metaphoric …

DivGAN: Towards diverse paraphrase generation via diversified generative adversarial network

Y Cao, X Wan - Findings of the Association for Computational …, 2020 - aclanthology.org
Paraphrases refer to texts that convey the same meaning with different expression forms.
Traditional seq2seq-based models on paraphrase generation mainly focus on the fidelity …

Unsupervised paraphrase generation using pre-trained language models

C Hegde, S Patil - arXiv preprint arXiv:2006.05477, 2020 - arxiv.org
Large-scale pre-trained language models have proven to be a very powerful approach to
various natural language tasks. OpenAI's GPT-2 (Radford et al., 2019) is notable for …

User utterance acquisition for training task-oriented bots: a review of challenges, techniques and opportunities

MA Yaghoub-Zadeh-Fard, B Benatallah… - IEEE Internet …, 2020 - ieeexplore.ieee.org
Building conversational task-oriented bots requires large and diverse sets of annotated user
utterances to learn mappings between natural language utterances and user intents. Given …

ConRPG: Paraphrase generation using contexts as regularizer

Y Meng, X Ao, Q He, X Sun, Q Han, F Wu, J Li - arXiv preprint arXiv …, 2021 - arxiv.org
A long-standing issue with paraphrase generation is how to obtain reliable supervision
signals. In this paper, we propose an unsupervised paradigm for paraphrase generation …

Generating diverse translations with sentence codes

R Shu, H Nakayama, K Cho - … of the 57th annual meeting of the …, 2019 - aclanthology.org
Users of machine translation systems may desire to obtain multiple candidates translated in
different ways. In this work, we attempt to obtain diverse translations by using sentence …