
SOLOIST: Building Task Bots At Scale With Transfer Learning And Machine Teaching

Baolin Peng, Chunyuan Li, Jinchao Li, Shahin Shayandeh, Lars Liden, Jianfeng Gao. arXiv 2020

[Paper] [Code]

Tags: Few-Shot, Fine-Tuning, Has Code, Model Architecture, Pretraining Methods, Reinforcement Learning, Training Techniques, Transformer

We present SOLOIST, a new method that uses transfer learning and machine teaching to build task bots at scale. We parameterize classical modular task-oriented dialog systems using a Transformer-based auto-regressive language model, which subsumes the different dialog modules into a single neural model. We pre-train, on heterogeneous dialog corpora, a task-grounded response generation model that can generate dialog responses grounded in user goals and real-world knowledge for task completion. The pre-trained model can be efficiently adapted to new tasks with a handful of task-specific dialogs via machine teaching, where training samples are generated by human teachers interacting with the system. Experiments show that (i) SOLOIST sets a new state of the art on well-studied task-oriented dialog benchmarks, including CamRest676 and MultiWOZ; (ii) in the few-shot fine-tuning setting, SOLOIST significantly outperforms existing methods; and (iii) machine teaching substantially reduces the labeling cost of fine-tuning. The pre-trained models and code are available at https://aka.ms/soloist.
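The core idea of subsuming the classical NLU / state-tracking / policy / NLG modules into one auto-regressive model amounts to serializing each dialog turn (history, belief state, database result, delexicalized response) into a single token sequence that the language model is trained on end to end. A minimal sketch of such a serialization, assuming illustrative marker tokens rather than the paper's exact vocabulary or code:

```python
# Sketch (not the authors' implementation) of a SOLOIST-style training
# sequence: dialog history, belief state, DB result, and delexicalized
# response concatenated into one string for a causal language model.
# The "<sep>"/"<eos>" markers and field labels are assumed for illustration.

def format_soloist_example(history, belief_state, db_result, response):
    """Serialize one dialog turn into a single training sequence."""
    parts = [
        "history: " + " ".join(history),
        "belief: " + belief_state,
        "db: " + db_result,
        "response: " + response,
    ]
    return " <sep> ".join(parts) + " <eos>"

example = format_soloist_example(
    history=["user: I need a cheap restaurant in the centre."],
    belief_state="restaurant { pricerange = cheap ; area = centre }",
    db_result="3 matches",
    response="[name] is a cheap restaurant in the centre. Shall I book it?",
)
print(example)
```

At inference time, only the history segment is given as the prompt and the model generates the belief state, which is used to query the database, after which generation continues to produce the grounded response.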

Similar Work