Robust Conversational AI With Grounded Text Generation

Jianfeng Gao, Baolin Peng, Chunyuan Li, Jinchao Li, Shahin Shayandeh, Lars Liden, Heung-Yeung Shum. arXiv 2020

[Paper]
Tags: Applications, Language Modeling, Model Architecture, Pretraining Methods, Reinforcement Learning, Transformer

This article presents a hybrid approach based on a Grounded Text Generation (GTG) model to building robust task bots at scale. GTG is a hybrid model that uses a large-scale Transformer neural network as its backbone, combined with symbol-manipulation modules for knowledge base inference and prior knowledge encoding, to generate responses grounded in dialog belief state and real-world knowledge for task completion. GTG is pre-trained on large amounts of raw text and human conversational data, and can be fine-tuned to complete a wide range of tasks. The hybrid approach and its variants are being developed simultaneously by multiple research teams. The primary results reported on task-oriented dialog benchmarks are very promising, demonstrating the significant potential of this approach. This article provides an overview of this progress and discusses related methods and technologies that can be incorporated to build robust conversational AI systems.
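To make the grounded-generation pipeline concrete, the sketch below mocks the three stages the abstract names: tracking a dialog belief state, running symbolic knowledge base inference over it, and generating a response grounded in the retrieved results. This is a minimal illustration of the control flow, not the paper's implementation; the function names, slot vocabulary, and toy knowledge base are all hypothetical, and the neural components are replaced with simple stand-ins.

```python
# Minimal sketch of a GTG-style pipeline: belief tracking -> KB inference
# -> grounded generation. The neural components are stubbed out with
# keyword rules; only the overall control flow mirrors the paper's
# description. All names and the toy KB are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class BeliefState:
    """Slot-value pairs tracked over the dialog (e.g. cuisine=chinese)."""
    slots: dict = field(default_factory=dict)


def update_belief_state(state: BeliefState, user_utterance: str) -> BeliefState:
    # Stand-in for the Transformer backbone's belief tracking; a real GTG
    # model would decode the belief state from the full dialog history.
    keyword_map = {
        "chinese": ("cuisine", "chinese"),
        "cheap": ("price", "cheap"),
    }
    for keyword, (slot, value) in keyword_map.items():
        if keyword in user_utterance.lower():
            state.slots[slot] = value
    return state


def query_knowledge_base(state: BeliefState) -> list:
    # Symbol-manipulation module: exact-match inference over a toy KB.
    kb = [
        {"name": "Golden Wok", "cuisine": "chinese", "price": "cheap"},
        {"name": "Lotus Garden", "cuisine": "chinese", "price": "expensive"},
    ]
    return [row for row in kb
            if all(row.get(slot) == value for slot, value in state.slots.items())]


def generate_response(state: BeliefState, results: list) -> str:
    # Stand-in for grounded generation; a real GTG model would condition
    # the Transformer decoder on the belief state and the KB results.
    if results:
        top = results[0]
        return f"I found {top['name']}, a {top['price']} {top['cuisine']} restaurant."
    return "Sorry, I could not find a matching restaurant."


if __name__ == "__main__":
    state = BeliefState()
    state = update_belief_state(state, "I want a cheap Chinese restaurant.")
    print(generate_response(state, query_knowledge_base(state)))
```

The point of the separation is that the knowledge base lookup stays symbolic and auditable, while the neural model handles language understanding and surface realization, which is the division of labor the hybrid approach advocates.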

Similar Work