C Liang, W Wang, T Zhou… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Abductive reasoning seeks the likeliest possible explanation for partial observations. Although abduction is frequently employed in human daily reasoning, it is rarely explored in …
Despite recent progress of pre-trained language models in generating fluent text, existing methods still suffer from incoherence problems in long-form text generation tasks that …
X He - arXiv preprint arXiv:2109.12487, 2021 - arxiv.org
Lexically constrained text generation aims to control the generated text by incorporating some pre-specified keywords into the output. Previous work injects lexical constraints into …
Representation learning for text via pretraining a language model on a large corpus has become a standard starting point for building NLP systems. This approach stands in contrast …
Storytelling has always been a vital part of human nature. Since ancient times, humans have used stories for several purposes, including entertainment, advertisement, and education. Various …
This paper proposes a new self-attention based model for music score infilling, i.e., generating a polyphonic music sequence that fills in the gap between given past and future …
M Roemmele - arXiv preprint arXiv:2107.04007, 2021 - arxiv.org
Getting machines to generate text perceived as creative is a long-pursued goal. A growing body of research directs this goal towards augmenting the creative writing abilities of human …
One of the most challenging topics in Natural Language Processing (NLP) is visually-grounded language understanding and reasoning. Outdoor vision-and-language navigation …
Story ideation is a critical part of the story-writing process. It is challenging to support computationally due to its exploratory and subjective nature. Tropes, which are recurring …