Compositional semantic parsing with large language models

A Drozdov, N Schärli, E Akyürek, N Scales… - The Eleventh …, 2022 - openreview.net
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …

The devil is in the detail: Simple tricks improve systematic generalization of transformers

R Csordás, K Irie, J Schmidhuber - arXiv preprint arXiv:2108.12284, 2021 - arxiv.org
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …

Natural SQL: Making SQL easier to infer from natural language specifications

Y Gan, X Chen, J Xie, M Purver, JR Woodward… - arXiv preprint arXiv …, 2021 - arxiv.org
Addressing the mismatch between natural language descriptions and the corresponding
SQL queries is a key challenge for text-to-SQL translation. To bridge this gap, we propose …

Improving compositional generalization with latent structure and data augmentation

L Qiu, P Shaw, P Pasupat, PK Nowak, T Linzen… - arXiv preprint arXiv …, 2021 - arxiv.org
Generic unstructured neural networks have been shown to struggle on out-of-distribution
compositional generalization. Compositional data augmentation via example recombination …

Making transformers solve compositional tasks

S Ontanon, J Ainslie, V Cvicek, Z Fisher - arXiv preprint arXiv:2108.04378, 2021 - arxiv.org
Several studies have reported the inability of Transformer models to generalize
compositionally, a key type of generalization in many NLP tasks such as semantic parsing …

How Do In-Context Examples Affect Compositional Generalization?

S An, Z Lin, Q Fu, B Chen, N Zheng, JG Lou… - arXiv preprint arXiv …, 2023 - arxiv.org
Compositional generalization, understanding unseen combinations of seen primitives, is an
essential reasoning capability in human intelligence. The AI community mainly studies this …

Consistency regularization training for compositional generalization

Y Yin, J Zeng, Y Li, F Meng, J Zhou… - Proceedings of the 61st …, 2023 - aclanthology.org
Existing neural models have difficulty generalizing to unseen combinations of seen
components. To achieve compositional generalization, models are required to consistently …

Modern baselines for SPARQL semantic parsing

D Banerjee, PA Nair, JN Kaur, R Usbeck… - Proceedings of the 45th …, 2022 - dl.acm.org
In this work, we focus on the task of generating SPARQL queries from natural language
questions, which can then be executed on Knowledge Graphs (KGs). We assume that gold …

Reason first, then respond: Modular generation for knowledge-infused dialogue

L Adolphs, K Shuster, J Urbanek, A Szlam… - arXiv preprint arXiv …, 2021 - arxiv.org
Large language models can produce fluent dialogue but often hallucinate factual
inaccuracies. While retrieval-augmented models help alleviate this issue, they still face a …

Uncontrolled lexical exposure leads to overestimation of compositional generalization in pretrained models

N Kim, T Linzen, P Smolensky - arXiv preprint arXiv:2212.10769, 2022 - arxiv.org
Human linguistic capacity is often characterized by compositionality and the generalization it
enables: human learners can produce and comprehend novel complex expressions by …