Neuro-symbolic Procedural Planning With Commonsense Prompting

Yujie Lu, Weixi Feng, Wanrong Zhu, Wenda Xu, Xin Eric Wang, Miguel Eckstein, William Yang Wang. arXiv 2022

[Paper]    
Prompting, Training Techniques

Procedural planning aims to implement complex high-level goals by decomposing them into sequences of simpler low-level steps. Although procedural planning is a basic skill for humans in daily life, it remains a challenge for large language models (LLMs), which lack a deep understanding of the cause-effect relations in procedures. Previous methods require manual exemplars to acquire procedural planning knowledge from LLMs in the zero-shot setting. However, such elicited pre-trained knowledge in LLMs induces spurious correlations between goals and steps, which impair the model's generalization to unseen tasks. In contrast, this paper proposes a neuro-symbolic procedural PLANner (PLAN) that elicits procedural planning knowledge from LLMs with commonsense-infused prompting. To mitigate spurious goal-step correlations, symbolic program executors operate on latent procedural representations to formalize prompts drawn from commonsense knowledge bases, acting as a causal intervention on the Structural Causal Model. Both automatic and human evaluations on WikiHow and RobotHow show the superiority of PLAN on procedural planning without further training or manual exemplars.
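The core idea of commonsense-infused prompting can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual implementation: the toy knowledge base, the naive retrieval by word overlap, and all function names are assumptions made for clarity; the paper works with real commonsense knowledge bases (ConceptNet-style triples) and a symbolic program executor.

```python
# Hypothetical sketch of commonsense-infused prompting for procedural
# planning. The knowledge base and matching logic are illustrative only.

# A toy commonsense knowledge base of (head, relation, tail) triples,
# in the spirit of ConceptNet-style resources.
KNOWLEDGE_BASE = [
    ("make coffee", "HasSubevent", "boil water"),
    ("make coffee", "HasSubevent", "grind beans"),
    ("boil water", "HasPrerequisite", "fill kettle"),
]

def retrieve_triples(goal, kb):
    """Select triples whose head shares words with the goal (naive matching)."""
    goal_words = set(goal.lower().split())
    return [t for t in kb if goal_words & set(t[0].split())]

def build_prompt(goal, kb):
    """Infuse retrieved commonsense facts into the planning prompt,
    so the elicited steps are grounded in the knowledge base rather
    than in spurious goal-step correlations alone."""
    facts = retrieve_triples(goal, kb)
    fact_lines = [f"- {h} {r} {t}" for h, r, t in facts]
    return (
        "Commonsense facts:\n" + "\n".join(fact_lines)
        + f"\n\nGoal: {goal}\nList the steps to achieve the goal:\nStep 1:"
    )

prompt = build_prompt("make coffee", KNOWLEDGE_BASE)
print(prompt)
```

The resulting prompt would then be sent to an LLM to elicit the step sequence; the causal-intervention view in the paper treats the injected facts as conditioning that blocks spurious goal-step shortcuts.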

Similar Work