G Luo, YT Han, L Mou, M Firdaus - arXiv preprint arXiv:2301.11997, 2023 - arxiv.org
Prompting approaches have been recently explored in text style transfer, where a textual prompt is used to query a pretrained language model to generate style-transferred texts …
The goal of compositional generalization benchmarks is to evaluate how well models generalize to new complex linguistic expressions. Existing benchmarks often focus on …
We present Referee, a novel framework for sentence summarization that can be trained reference-free (i.e., requiring no gold summaries for supervision), while allowing direct control …
C Yuan, H Huang, Y Cao, Q Cao - Information Processing & Management, 2024 - Elsevier
Lexically constrained text generation (CTG) aims to generate text that contains given constraint keywords. However, the text diversity of existing models is still unsatisfactory. In …
In Natural Language Processing (NLP), predicting linguistic structures, such as parses and chunks, has mostly relied on manual annotation of syntactic structures. This paper …
Long sentences have long been a persistent issue in written communication, since they make it challenging for readers to grasp the main points or follow the initial intention of …
Sentence summarization shortens a given text while maintaining its core content. Unsupervised approaches have been studied to summarize texts without human-written …
P Liu, X Zhang, L Mou - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Sentence summarization aims at compressing a long sentence into a short one that keeps the main gist, and has extensive real-world applications such as headline generation. In …
P Guo, Y Xiao, J Li, M Zhang - Proceedings of the AAAI Conference on …, 2023 - ojs.aaai.org
Non-autoregressive neural machine translation (NAT) models have been proposed to accelerate the inference process while maintaining relatively high performance. However, existing NAT …