A survey on text-to-SQL parsing: Concepts, methods, and future directions

B Qin, B Hui, L Wang, M Yang, J Li, B Li… - arXiv preprint arXiv …, 2022 - arxiv.org
Text-to-SQL parsing is an essential and challenging task. The goal of text-to-SQL parsing is
to convert a natural language (NL) question to its corresponding structured query language …

Can LLM already serve as a database interface? A big bench for large-scale database grounded text-to-SQLs

J Li, B Hui, G Qu, J Yang, B Li, B Li… - Advances in …, 2024 - proceedings.neurips.cc
Text-to-SQL parsing, which aims at converting natural language instructions into executable
SQLs, has gained increasing attention in recent years. In particular, GPT-4 and Claude-2 …

Least-to-most prompting enables complex reasoning in large language models

D Zhou, N Schärli, L Hou, J Wei, N Scales… - arXiv preprint arXiv …, 2022 - arxiv.org
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks which require …

RESDSQL: Decoupling schema linking and skeleton parsing for text-to-SQL

H Li, J Zhang, C Li, H Chen - Proceedings of the AAAI Conference on …, 2023 - ojs.aaai.org
One of the recent best attempts at text-to-SQL is the pre-trained language model. Due to the
structural property of SQL queries, the seq2seq model takes responsibility for parsing …

PICARD: Parsing incrementally for constrained auto-regressive decoding from language models

T Scholak, N Schucher, D Bahdanau - arXiv preprint arXiv:2109.05093, 2021 - arxiv.org
Large pre-trained language models for textual data have an unconstrained output space; at
each decoding step, they can produce any of 10,000s of sub-word tokens. When fine-tuned …

Compositional exemplars for in-context learning

J Ye, Z Wu, J Feng, T Yu… - … Conference on Machine …, 2023 - proceedings.mlr.press
Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL)
ability, where the model learns to do an unseen task simply by conditioning on a prompt …

Compositional semantic parsing with large language models

A Drozdov, N Schärli, E Akyürek, N Scales… - The Eleventh …, 2022 - openreview.net
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …

Grammar prompting for domain-specific language generation with large language models

B Wang, Z Wang, X Wang, Y Cao… - Advances in Neural …, 2024 - proceedings.neurips.cc
Large language models (LLMs) can learn to perform a wide range of natural language tasks
from just a handful of in-context examples. However, for generating strings from highly …

Graphix-T5: Mixing pre-trained transformers with graph-aware layers for text-to-SQL parsing

J Li, B Hui, R Cheng, B Qin, C Ma, N Huo… - Proceedings of the …, 2023 - ojs.aaai.org
The task of text-to-SQL parsing, which aims at converting natural language questions into
executable SQL queries, has garnered increasing attention in recent years. One of the major …

A survey on complex factual question answering

L Zhang, J Zhang, X Ke, H Li, X Huang, Z Shao, S Cao… - AI Open, 2023 - Elsevier
Answering complex factual questions has drawn a lot of attention. Researchers leverage
various data sources to support complex QA, such as unstructured texts, structured …