
KG-BART: Knowledge Graph-augmented BART For Generative Commonsense Reasoning

Ye Liu, Yao Wan, Lifang He, Hao Peng, Philip S. Yu. arXiv 2020

[Paper]    
Applications Attention Mechanism Language Modeling Model Architecture RAG Reinforcement Learning

Generative commonsense reasoning, which aims to empower machines to generate sentences by reasoning over a set of concepts, is a critical bottleneck for text generation. Even state-of-the-art pre-trained language generation models struggle with this task and often produce implausible and anomalous sentences. One reason is that they rarely consider incorporating knowledge graphs, which can provide rich relational information among commonsense concepts. To promote the ability of commonsense reasoning for text generation, we propose a novel knowledge-graph-augmented pre-trained language generation model, KG-BART, which encompasses the complex relations of concepts through the knowledge graph and produces more logical and natural sentences as output. Moreover, KG-BART can leverage graph attention to aggregate rich concept semantics, which enhances the model's generalization to unseen concept sets. Experiments on the CommonGen benchmark dataset verify the effectiveness of our proposed approach against several strong pre-trained language generation models; in particular, KG-BART outperforms BART by 5.80 and 4.60 points in terms of BLEU-3 and BLEU-4, respectively. Moreover, we also show that the context generated by our model can serve as background scenarios to benefit downstream commonsense QA tasks.
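The abstract's central architectural idea, using graph attention over a commonsense knowledge graph to enrich concept representations that are then fused with a BART-style encoder's token states, can be sketched roughly as below. This is a minimal illustrative sketch, not the paper's actual KG-BART implementation; the module names (`ConceptGraphAttention`, `KGAugmentedFusion`), the single-head GAT-style scoring, and the residual cross-attention fusion are all assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation) of the core idea:
# aggregate concept-node semantics with graph attention, then fuse the
# graph-aware concept vectors with token states from a pretrained
# seq2seq encoder such as BART. All names here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConceptGraphAttention(nn.Module):
    """Single-head, GAT-style attention over concept nodes."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim, bias=False)
        self.attn = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, node_emb: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_emb: (num_nodes, dim); adj: (num_nodes, num_nodes), 1 where an edge exists
        h = self.proj(node_emb)
        n = h.size(0)
        # Pairwise attention logits e_ij = a([h_i ; h_j])
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)], dim=-1
        )
        logits = F.leaky_relu(self.attn(pairs).squeeze(-1))
        logits = logits.masked_fill(adj == 0, float("-inf"))
        weights = torch.softmax(logits, dim=-1)
        # Aggregate neighbour semantics into each concept node
        return weights @ h


class KGAugmentedFusion(nn.Module):
    """Fuse graph-aware concept vectors with encoder token states via cross-attention."""

    def __init__(self, dim: int):
        super().__init__()
        self.graph_attn = ConceptGraphAttention(dim)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, token_states, concept_emb, adj):
        # token_states: (batch, seq_len, dim) from a pretrained encoder (e.g. BART)
        concept_ctx = self.graph_attn(concept_emb, adj)           # (num_nodes, dim)
        concept_ctx = concept_ctx.unsqueeze(0).expand(token_states.size(0), -1, -1)
        fused, _ = self.cross_attn(token_states, concept_ctx, concept_ctx)
        return token_states + fused                               # residual fusion


if __name__ == "__main__":
    dim, nodes = 32, 5
    model = KGAugmentedFusion(dim)
    tokens = torch.randn(2, 7, dim)            # mock encoder output
    concepts = torch.randn(nodes, dim)         # mock concept embeddings
    adj = torch.ones(nodes, nodes)             # fully connected toy graph
    print(model(tokens, concepts, adj).shape)  # torch.Size([2, 7, 32])
```

In this toy setup the graph-attended concept vectors act as an extra memory that the encoder's token states can attend over, which loosely mirrors the abstract's claim that aggregating relational concept semantics helps the generator stay consistent with commonsense relations.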

Similar Work