Think-on-graph 2.0: Deep And Interpretable Large Language Model Reasoning With Knowledge Graph-guided Retrieval

Ma Shengjie, Xu Chengjin, Jiang Xuhui, Li Muzhi, Qu Huaren, Guo Jian. arXiv 2024

[Paper]    
Applications RAG Tools

Retrieval-augmented generation (RAG) has significantly advanced large language models (LLMs) by enabling dynamic information retrieval to mitigate knowledge gaps and hallucinations in generated content. However, these systems often falter with complex reasoning and consistency across diverse queries. In this work, we present Think-on-Graph 2.0, an enhanced RAG framework that aligns questions with a knowledge graph and uses it as a navigational tool, deepening and refining the RAG paradigm for information collection and integration. The KG-guided navigation fosters deep and long-range associations to uphold logical consistency and optimize the scope of retrieval for precision and interpretability. In conjunction, factual consistency can be better ensured through semantic similarity guided by precise directives. ToG-2.0 not only improves the accuracy and reliability of LLMs' responses but also demonstrates the potential of hybrid structured knowledge systems to significantly advance LLM reasoning, aligning it closer to human-like performance. We conducted extensive experiments on four public datasets to demonstrate the advantages of our method compared to the baselines.
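The core idea of KG-guided navigation described above can be sketched as an iterative graph walk: starting from the question's topic entities, candidate edges are scored for relevance to the question and only the most promising ones are followed on each hop. The sketch below is purely illustrative and not the paper's implementation: the toy graph, function names, and the keyword-overlap scorer (standing in for the LLM/semantic-similarity pruning the paper uses) are all assumptions.

```python
# Hypothetical sketch of KG-guided retrieval in the spirit of Think-on-Graph 2.0.
# The real framework prunes relations/entities with an LLM and dense semantic
# similarity; here a simple keyword-overlap score stands in for that step.
import re

# Toy knowledge graph: entity -> list of (relation, neighbor entity).
KG = {
    "Ada Lovelace": [("collaborated_with", "Charles Babbage"),
                     ("field", "Mathematics")],
    "Charles Babbage": [("designed", "Analytical Engine"),
                        ("field", "Mathematics")],
    "Analytical Engine": [("type", "Mechanical computer")],
}

def score(question, text):
    """Keyword-overlap proxy for LLM / semantic-similarity relevance scoring."""
    q = set(re.findall(r"[a-z]+", question.lower()))
    t = set(re.findall(r"[a-z]+", text.lower()))
    return len(q & t)

def kg_guided_retrieve(question, topic_entities, max_hops=2, beam=1):
    """Walk the KG hop by hop, keeping the most question-relevant edges."""
    frontier = list(topic_entities)
    path = []  # collected (head, relation, tail) triples for the LLM context
    for _ in range(max_hops):
        candidates = []
        for ent in frontier:
            for rel, nbr in KG.get(ent, []):
                candidates.append((score(question, f"{rel} {nbr}"), ent, rel, nbr))
        candidates.sort(reverse=True)
        kept = candidates[:beam]  # beam-style pruning of the expansion
        if not kept:
            break
        path.extend((e, r, n) for _, e, r, n in kept)
        frontier = [n for _, _, _, n in kept]
    return path

triples = kg_guided_retrieve("Which engine did Babbage design?", ["Ada Lovelace"])
print(triples)
```

A real system would feed the collected triples (plus retrieved documents) back to the LLM at each hop, letting it decide whether to answer or keep exploring; the beam width and hop limit play the same role as the depth/width controls in the iterative retrieval loop the abstract alludes to.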

Similar Work