For many new application domains of data-to-text generation, the main obstacle to training neural models is the lack of training data. While usually large numbers of instances …
Large-scale pretrained language models have led to dramatic improvements in text generation. Impressive performance can be achieved by finetuning only on a small number …
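A minimal sketch of this kind of small-data finetuning, using Hugging Face Transformers; the toy data-to-text record, the model choice, and all hyperparameters are illustrative assumptions, not the setup used in the cited work.

```python
# Hedged sketch: finetune a pretrained seq2seq LM on a handful of
# data-to-text pairs. The tiny in-memory dataset below is purely illustrative.
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments)

model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy "small number" of labelled instances (linearised record -> text).
small_train_set = Dataset.from_dict({
    "data": ["name[Blue Spice] | eatType[coffee shop]"],
    "text": ["Blue Spice is a coffee shop."],
})

def preprocess(example):
    # Tokenise the input record and its reference verbalisation.
    enc = tokenizer(example["data"], truncation=True, max_length=256)
    labels = tokenizer(text_target=example["text"], truncation=True, max_length=128)
    enc["labels"] = labels["input_ids"]
    return enc

train_data = small_train_set.map(preprocess, remove_columns=small_train_set.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="ft-small", num_train_epochs=10,
                                  per_device_train_batch_size=4, learning_rate=3e-5),
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```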
Recent advances in data-to-text generation largely take the form of neural end-to-end systems. Efforts have been dedicated to improving text generation systems by changing …
Objective: Weak supervision holds significant promise for improving clinical natural language processing by leveraging domain resources and expertise instead of large manually …
This paper explores deep latent variable models for semi-supervised paraphrase generation, where the missing target pair for unlabelled data is modelled as a latent …
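One standard way to realise "missing target as latent variable" is a variational objective that sums a supervised term over labelled pairs and a sampled lower bound over unlabelled inputs; the sketch below shows only that objective, with placeholder callables standing in for real encoder/decoder networks (all names are assumptions, not the cited model).

```python
from typing import Callable, Tuple

# Hedged sketch of a semi-supervised latent-target objective:
#   labelled (x, y):   maximise log p(y|x)
#   unlabelled x:      log p(x) >= E_{y~q(y|x)}[log p(x|y) + log p(y) - log q(y|x)]
# The callables below are stand-ins for neural log-probability / sampling functions.

def supervised_loss(log_p_y_given_x: Callable[[str, str], float],
                    x: str, y: str) -> float:
    # Negative conditional log-likelihood of the observed paraphrase.
    return -log_p_y_given_x(y, x)

def unsupervised_loss(sample_q: Callable[[str], Tuple[str, float]],
                      log_p_x_given_y: Callable[[str, str], float],
                      log_prior_y: Callable[[str], float],
                      x: str, n_samples: int = 4) -> float:
    # Monte Carlo estimate of the negative ELBO, treating the target as latent.
    elbo = 0.0
    for _ in range(n_samples):
        y_hat, log_q = sample_q(x)          # sample a candidate paraphrase y ~ q(y|x)
        elbo += log_p_x_given_y(x, y_hat) + log_prior_y(y_hat) - log_q
    return -elbo / n_samples
```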
G Yan, J Pei, P Ren, Z Ren, X Xin, H Liang… - Proceedings of the 45th …, 2022 - dl.acm.org
Medical dialogue systems (MDSs) aim to assist doctors and patients with a range of professional medical services, i.e., diagnosis, treatment, and consultation. The development of MDSs is hindered because of …
This study discusses the effect of semi-supervised learning in combination with pretrained language models for data-to-text generation. It is not known whether semi-supervised …
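A common semi-supervised recipe in this setting is self-training with pseudo-labels; the sketch below shows that generic loop under stated assumptions (the callables are placeholders for a finetuning routine and a generation routine) and is not a reconstruction of the cited study's method.

```python
from typing import Callable, List, Tuple

Pair = Tuple[str, str]  # (linearised data record, reference text)

def self_training_round(
    finetune: Callable[[List[Pair]], None],   # finetunes the pretrained LM in place
    generate: Callable[[str], str],           # generates text with the current LM
    labelled: List[Pair],
    unlabelled: List[str],
) -> List[Pair]:
    # 1. Finetune the pretrained model on the small labelled set.
    finetune(labelled)
    # 2. Pseudo-label the unlabelled data records with the current model.
    pseudo = [(x, generate(x)) for x in unlabelled]
    # 3. Finetune again on gold plus pseudo-labelled pairs.
    finetune(labelled + pseudo)
    return labelled + pseudo
```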
West African Pidgin English is widely spoken in West Africa, with at least 75 million speakers. Nevertheless, proper machine translation systems …
W Zheng, N Milic-Frayling, K Zhou - arXiv preprint arXiv:2305.18200, 2023 - arxiv.org
Incorporating conversational context and knowledge into dialogue generation models has been essential for improving the quality of the generated responses. The context, comprising …
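One simple, widely used way to incorporate both signals is to flatten the dialogue history and the retrieved knowledge into a single model input with separator tokens; the function and token names below are illustrative assumptions, not the cited paper's scheme.

```python
from typing import List

def build_model_input(history: List[str], knowledge: List[str],
                      sep: str = " <sep> ", kno: str = " <knowledge> ") -> str:
    # Join recent dialogue turns, then append retrieved knowledge snippets.
    context = sep.join(history)
    facts = kno.join(knowledge)
    return context + kno + facts

# Usage example with a toy dialogue and one knowledge snippet.
print(build_model_input(
    ["Where should we eat tonight?", "Somewhere with vegan options."],
    ["Blue Spice offers a vegan menu."],
))
```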