Neural network-based sequence-to-sequence (seq2seq) models strongly suffer from the low-diversity problem when it comes to open-domain dialogue generation. As bland and generic …
The neural attention model has achieved great success in data-to-text generation tasks. Though usually excelling at producing fluent text, it suffers from the problem of information …
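At the core of the neural attention models referenced in these entries is the scaled dot-product attention computation, in which each decoding step weighs the encoded input records before producing a token. The sketch below is a generic, minimal NumPy illustration of that computation under stated assumptions; the function name and toy shapes are illustrative, not code from the cited work.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Generic attention step: each decoder query attends over encoder states.

    Q: (num_queries, d_k)  decoder-side queries
    K: (num_keys, d_k)     encoder-side keys (e.g. encoded table cells/records)
    V: (num_keys, d_v)     encoder-side values
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the keys
    return weights @ V, weights                          # context vectors, attention map

# Toy example: one decoding step attending over three encoded records.
rng = np.random.default_rng(0)
Q = rng.normal(size=(1, 8))
K = rng.normal(size=(3, 8))
V = rng.normal(size=(3, 8))
context, attn = scaled_dot_product_attention(Q, K, V)
print(attn)  # each row sums to 1: how much each record contributes to this step
```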
Recent advancements in data-to-text generation largely take on the form of neural end-to-end systems. Efforts have been dedicated to improving text generation systems by changing …
Neural natural language generation (NLG) and understanding (NLU) models are data-hungry and require massive amounts of annotated data to be competitive. Recent …
J Ge, Y Huang, X Shen, C Li… - IEEE/ACM Transactions on …, 2021 - ieeexplore.ieee.org
Automatically recommending relevant law articles to a given legal case has attracted much attention, as it can greatly reduce the human labor spent searching over the large database of …
Product question answering (PQA) aims to automatically address customer questions to improve their online shopping experience. Current research mainly focuses on finding …
E Chang, J Caplinger, A Marin, X Shen… - arXiv preprint arXiv …, 2020 - arxiv.org
We present a lightweight annotation tool, the Data AnnotatoR Tool (DART), for the general task of labeling structured data with textual descriptions. The tool is implemented as an …
Developing effective spoken language processing systems for low-resource languages poses several challenges due to the lack of parallel data and limited resources for fine …
X Shen - arXiv preprint arXiv:2203.02055, 2022 - arxiv.org
Text generation aims to produce human-like natural language output for downstream tasks. It covers a wide range of applications like machine translation, document summarization …
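As a minimal, hedged illustration of the seq2seq text generation and diversity-oriented decoding discussed across these entries, the sketch below generates text with sampling-based decoding rather than pure greedy/beam search. The Hugging Face transformers library, the t5-small checkpoint, and all parameter values are assumptions chosen for illustration, not the systems from the cited papers.

```python
# Sampling-based decoding is one common way to counter the low-diversity
# problem of seq2seq generation noted in the first entry above.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")   # assumed checkpoint
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer(
    "summarize: Neural text generation covers machine translation, "
    "document summarization, and data-to-text tasks.",
    return_tensors="pt",
)

# Greedy/beam search tends toward bland, generic outputs; nucleus sampling
# trades a little determinism for more diverse text.
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,   # sample instead of taking the argmax at every step
    top_p=0.9,        # nucleus sampling: keep the smallest set covering 90% of the mass
    temperature=0.8,  # flatten/sharpen the distribution before sampling
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```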