
Divide And Prompt: Chain Of Thought Prompting For Text-to-sql

Liu Xiping, Tan Zhao. arXiv 2023

[Paper]
Prompting RAG

Chain-of-thought (CoT) prompting combined with large language models (LLMs) has achieved encouraging results on complex reasoning tasks. Text-to-SQL is a critical semantic parsing task that converts natural language questions into SQL statements and involves a complex reasoning process. However, there is little work on using CoT prompting to activate LLMs' reasoning capabilities on Text-to-SQL tasks. In this work, we propose a new paradigm for prompting Text-to-SQL tasks, called Divide-and-Prompt, which first divides the task into subtasks and then approaches each subtask through CoT. We present three prompting-based methods to enhance the Text-to-SQL ability of LLMs. Experiments show that these prompts guide LLMs to generate SQL statements with higher execution accuracy.
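
The abstract only outlines the paradigm, so the sketch below is a minimal, hypothetical illustration of the divide-and-prompt idea: the Text-to-SQL task is split into subtasks and each subtask is addressed with a chain-of-thought style prompt. The `llm` callable and the particular decomposition used here (relevant-schema selection, clause planning, final SQL assembly) are assumptions for illustration, not the paper's exact three prompting methods.

```python
from typing import Callable

# Hypothetical LLM interface: takes a prompt string, returns the model's completion.
LLM = Callable[[str], str]

def divide_and_prompt_text_to_sql(question: str, schema: str, llm: LLM) -> str:
    """Illustrative divide-and-prompt pipeline: split Text-to-SQL into subtasks,
    then solve each subtask with a chain-of-thought style prompt."""
    # Subtask 1: identify the tables and columns relevant to the question.
    relevant = llm(
        f"Database schema:\n{schema}\n\n"
        f"Question: {question}\n"
        "Let's think step by step: which tables and columns are needed to answer the question?"
    )
    # Subtask 2: plan the SQL clauses (SELECT / FROM / WHERE / GROUP BY ...) one by one.
    clauses = llm(
        f"Relevant tables and columns:\n{relevant}\n\n"
        f"Question: {question}\n"
        "Let's think step by step: describe the SELECT, FROM, WHERE, and other clauses needed."
    )
    # Subtask 3: assemble the final SQL statement from the clause-level reasoning.
    sql = llm(
        f"Clause-level plan:\n{clauses}\n\n"
        f"Question: {question}\n"
        "Write the final SQL query only, with no explanation."
    )
    return sql.strip()
```

In use, `llm` would wrap whatever model API is available (e.g., a chat-completion call), and the intermediate outputs of the first two prompts serve only as reasoning context for the final SQL-generation step.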

Similar Work