
DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation

Md Rashad Al Hasan Rony, Ricardo Usbeck, Jens Lehmann. arXiv 2022

[Paper]    
Applications, Attention Mechanism, Distillation, Efficiency And Optimization, Model Architecture, Reinforcement Learning

Task-oriented dialogue generation is challenging because the underlying knowledge is often dynamic and incorporating it effectively into the learning process is hard. Generating responses that are both human-like and informative is particularly difficult in this setting. Recent research has primarily focused on knowledge distillation methods in which the relationships between facts in a knowledge base are not effectively captured. In this paper, we go one step further and demonstrate how the structural information of a knowledge graph can improve a system's inference capabilities. Specifically, we propose DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model. Our system views relational knowledge as a knowledge graph and introduces (1) a structure-aware knowledge embedding technique and (2) a knowledge graph-weighted attention masking strategy that helps the system select relevant information during dialogue generation. An empirical evaluation demonstrates the effectiveness of DialoKG over state-of-the-art methods on several standard benchmark datasets.
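The abstract only names the graph-weighted attention masking idea without detailing it. The sketch below illustrates one plausible reading: attention logits over knowledge-graph tokens are soft-masked by a per-triple relevance weight before the softmax, so low-relevance facts are attended less. This is a minimal illustration, not the authors' implementation; the function name, tensor shapes, and the log-weight masking scheme are all assumptions.

```python
# Minimal sketch (hypothetical, not DialoKG's actual code) of a knowledge
# graph-weighted attention mask for a single attention head.
import torch
import torch.nn.functional as F

def graph_weighted_attention(q, k, v, triple_ids, triple_weights):
    """
    q, k, v:        (seq_len, d) query/key/value matrices for one head.
    triple_ids:     (seq_len,) KG-triple id per token; -1 marks dialogue tokens.
    triple_weights: (num_triples,) relevance weight in [0, 1] per triple.
    """
    d = q.size(-1)
    scores = q @ k.t() / d ** 0.5                 # raw attention logits

    # Per-key weight: dialogue tokens keep weight 1; knowledge tokens are
    # scaled by the estimated relevance of the triple they belong to.
    key_weights = torch.ones(k.size(0))
    kg_mask = triple_ids >= 0
    key_weights[kg_mask] = triple_weights[triple_ids[kg_mask]]

    # Soft masking: adding log-weights to the logits downweights (rather than
    # hard-masks) tokens from triples judged irrelevant to the current turn.
    scores = scores + torch.log(key_weights.clamp_min(1e-9)).unsqueeze(0)
    return F.softmax(scores, dim=-1) @ v

# Toy usage: 6 tokens; the first 4 are dialogue, the last 2 come from two triples.
q = k = v = torch.randn(6, 16)
triple_ids = torch.tensor([-1, -1, -1, -1, 0, 1])
triple_weights = torch.tensor([0.9, 0.1])         # triple 0 judged relevant
out = graph_weighted_attention(q, k, v, triple_ids, triple_weights)
print(out.shape)  # torch.Size([6, 16])
```

In this reading, the weighting acts as a soft prior over the knowledge graph rather than a hard filter, which matches the abstract's claim that the mask helps the model *select* relevant facts during generation.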

Similar Work