
Empowering Private Tutoring By Chaining Large Language Models

Yulin Chen, Ning Ding, Hai-tao Zheng, Zhiyuan Liu, Maosong Sun, Bowen Zhou. arXiv 2023

[Paper]    
Prompting RAG Tools

Artificial intelligence has been applied in various aspects of online education to facilitate teaching and learning. However, few approaches have been made toward a complete AI-powered tutoring system. In this work, we explore the development of a full-fledged intelligent tutoring system powered by state-of-the-art large language models (LLMs), covering automatic course planning and adjustment, tailored instruction, and flexible quiz evaluation. To make the system robust to prolonged interaction and cater to individualized education, it is decomposed into three interconnected core processes: interaction, reflection, and reaction. Each process is implemented by chaining LLM-powered tools together with dynamically updated memory modules. Tools are LLMs prompted to execute one specific task at a time, while memories are data stores that are updated over the course of the education process. Statistical results from learning logs demonstrate the effectiveness and mechanism of each tool. Subjective feedback from human users reveals the usability of each function, and comparison with ablation systems further testifies to the benefits of the designed processes in long-term interaction.
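The chaining pattern the abstract describes can be sketched as follows. This is not the authors' code: the tool names (`interact`, `reflect`, `react`), the memory keys, and the `call_llm` stub are all hypothetical placeholders standing in for real LLM API calls and the paper's actual prompt designs.

```python
def call_llm(prompt: str) -> str:
    # Stub: a real system would send `prompt` to an LLM API here.
    return f"[LLM output for: {prompt[:40]}]"

def make_tool(task_instruction: str):
    # A "tool" is an LLM fixed to one specific task via its prompt template;
    # the shared memory is injected into every prompt.
    def tool(memory: dict, tool_input: str) -> str:
        prompt = (f"{task_instruction}\n"
                  f"Memory: {memory}\n"
                  f"Input: {tool_input}")
        return call_llm(prompt)
    return tool

# Hypothetical tools mirroring the three core processes.
interact = make_tool("Tutor the student on the current lesson.")
reflect = make_tool("Summarize the student's progress and struggles.")
react = make_tool("Adjust the course plan based on the reflection.")

def tutoring_turn(memory: dict, student_message: str) -> str:
    # Chain interaction -> reflection -> reaction, with each stage
    # reading from and writing back to the shared memory modules.
    reply = interact(memory, student_message)
    memory["reflection"] = reflect(memory, reply)
    memory["course_plan"] = react(memory, memory["reflection"])
    return reply

memory = {"course_plan": "Lesson 1: fractions", "reflection": ""}
reply = tutoring_turn(memory, "I don't understand common denominators.")
```

The key design choice sketched here is that tools never talk to each other directly; they communicate only through the memory, which is what lets the system stay coherent over prolonged interaction.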

Similar Work