A survey of knowledge-enhanced text generation

W Yu, C Zhu, Z Li, Z Hu, Q Wang, H Ji… - ACM Computing …, 2022 - dl.acm.org
The goal of text-to-text generation is to make machines express themselves like humans in many
applications such as conversation, summarization, and translation. It is one of the most …

Recent advances in retrieval-augmented text generation

D Cai, Y Wang, L Liu, S Shi - Proceedings of the 45th international ACM …, 2022 - dl.acm.org
Recently, retrieval-augmented text generation has achieved state-of-the-art performance on
many NLP tasks and has attracted increasing attention from the NLP and IR communities; this …

What Makes Good In-Context Examples for GPT-3?

J Liu, D Shen, Y Zhang, B Dolan, L Carin… - arXiv preprint arXiv …, 2021 - arxiv.org
GPT-3 has attracted lots of attention due to its superior performance across a wide range
of NLP tasks, especially with its powerful and versatile in-context few-shot learning ability …

Reformulating unsupervised style transfer as paraphrase generation

K Krishna, J Wieting, M Iyyer - arXiv preprint arXiv:2010.05700, 2020 - arxiv.org
Modern NLP defines the task of style transfer as modifying the style of a given sentence
without appreciably changing its semantics, which implies that the outputs of style transfer …

ToTTo: A controlled table-to-text generation dataset

AP Parikh, X Wang, S Gehrmann, M Faruqui… - arXiv preprint arXiv …, 2020 - arxiv.org
We present ToTTo, an open-domain English table-to-text dataset with over 120,000 training
examples that proposes a controlled generation task: given a Wikipedia table and a set of …

Lift yourself up: Retrieval-augmented text generation with self-memory

X Cheng, D Luo, X Chen, L Liu… - Advances in Neural …, 2024 - proceedings.neurips.cc
With direct access to human-written references as memory, retrieval-augmented generation
has achieved much progress in a wide range of text generation tasks. Since better memory …

A survey on retrieval-augmented text generation

H Li, Y Su, D Cai, Y Wang, L Liu - arXiv preprint arXiv:2202.01110, 2022 - arxiv.org
Recently, retrieval-augmented text generation has attracted increasing attention from the
computational linguistics community. Compared with conventional generation models …

Compound probabilistic context-free grammars for grammar induction

Y Kim, C Dyer, AM Rush - arXiv preprint arXiv:1906.10225, 2019 - arxiv.org
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …

Tailor: Generating and perturbing text with semantic controls

A Ross, T Wu, H Peng, ME Peters… - arXiv preprint arXiv …, 2021 - arxiv.org
Controlled text perturbation is useful for evaluating and improving model generalizability.
However, current techniques rely on training a model for every target perturbation, which is …

Multilingual generative language models for zero-shot cross-lingual event argument extraction

KH Huang, I Hsu, P Natarajan, KW Chang… - arXiv preprint arXiv …, 2022 - arxiv.org
We present a study on leveraging multilingual pre-trained generative language models for
zero-shot cross-lingual event argument extraction (EAE). By formulating EAE as a language …