WC Gan, HT Ng - Proceedings of the 57th annual meeting of the …, 2019 - aclanthology.org
Despite the advancement of question answering (QA) systems and rapid improvements on held-out test sets, their generalizability is a topic of concern. We explore the robustness of …
T Goyal, G Durrett - arXiv preprint arXiv:2005.02013, 2020 - arxiv.org
Paraphrasing natural language sentences is a multifaceted process: it might involve replacing individual words or short phrases, local rearrangement of content, or high-level …
Different texts naturally correspond to different numbers of keyphrases. This desideratum is largely missing from existing neural keyphrase generation models. In this …
Metaphor generation is a difficult task that has seen tremendous improvement with the advent of deep pretrained models. We focus here on the specific task of metaphoric …
Y Cao, X Wan - Findings of the Association for Computational …, 2020 - aclanthology.org
Paraphrases are texts that convey the same meaning in different forms of expression. Traditional seq2seq-based models for paraphrase generation mainly focus on the fidelity …
C Hegde, S Patil - arXiv preprint arXiv:2006.05477, 2020 - arxiv.org
Large-scale pre-trained language models have proven to be a very powerful approach to various natural language tasks. OpenAI's GPT-2 (Radford et al., 2019) is notable for …
Building conversational task-oriented bots requires large and diverse sets of annotated user utterances to learn mappings between natural language utterances and user intents. Given …
A long-standing issue with paraphrase generation is how to obtain reliable supervision signals. In this paper, we propose an unsupervised paradigm for paraphrase generation …
R Shu, H Nakayama, K Cho - … of the 57th annual meeting of the …, 2019 - aclanthology.org
Users of machine translation systems may want to obtain multiple candidates translated in different ways. In this work, we attempt to obtain diverse translations by using sentence …