
Interactive Teaching For Conversational AI

Ping Qing, Niu Feiyang, Thattai Govind, Chengottusseriyil Joel, Gao Qiaozi, Reganti Aishwarya, Rajagopal Prashanth, Tur Gokhan, Hakkani-tur Dilek, Nataraja Prem. Arxiv 2020

[Paper]    
Tags: Model Architecture, Pretraining Methods, Transformer

Current conversational AI systems aim to understand a set of pre-designed requests and execute related actions, which limits their ability to evolve naturally and adapt based on human interactions. Motivated by how children learn their first language by interacting with adults, this paper describes a new Teachable AI system that is capable of learning new language nuggets, called concepts, directly from end users through live interactive teaching sessions. The proposed setup uses three models to: a) automatically identify gaps in understanding during live conversational interactions, b) learn the interpretations of such unknown concepts from live interactions with users, and c) manage a classroom sub-dialogue specifically tailored for interactive teaching sessions. We propose state-of-the-art transformer-based neural architectures for these models, fine-tuned on top of pre-trained models, and show accuracy improvements on the respective components. We demonstrate that this method is a promising step toward building more adaptive and personalized language understanding models.

Similar Work