
DuetRAG: Collaborative Retrieval-Augmented Generation

Jiao Dian, Cai Li, Huang Jingsheng, Zhang Wenqiao, Tang Siliang, Zhuang Yueting. arXiv 2024

[Paper]    
RAG Tools

Retrieval-Augmented Generation (RAG) methods augment the input of Large Language Models (LLMs) with relevant retrieved passages, reducing factual errors in knowledge-intensive tasks. However, contemporary RAG approaches suffer from irrelevant knowledge retrieval in complex domain questions (e.g., HotpotQA) due to a lack of corresponding domain knowledge, leading to low-quality generations. To address this issue, we propose a novel Collaborative Retrieval-Augmented Generation framework, DuetRAG. Our bootstrapping philosophy is to simultaneously integrate domain fine-tuning and RAG models to improve knowledge retrieval quality, thereby enhancing generation quality. Finally, we demonstrate that DuetRAG matches expert human researchers on HotpotQA.
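The abstract describes the framework only at a high level, so the sketch below is a minimal, hypothetical illustration of the general pattern it alludes to: a retriever supplies passages and a generator (standing in for a domain fine-tuned LLM) answers from them. All names here (`KeywordRetriever`, `rag_answer`, the placeholder generator) are assumptions for illustration and are not taken from the DuetRAG paper or its code.

```python
# Illustrative RAG skeleton: retrieve passages, then let a (notionally
# domain fine-tuned) generator answer from the augmented prompt.
# This is NOT the DuetRAG implementation, only a generic sketch.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Passage:
    text: str
    score: float


class KeywordRetriever:
    """Toy retriever: ranks passages by keyword overlap with the question."""

    def __init__(self, corpus: List[str]):
        self.corpus = corpus

    def retrieve(self, question: str, k: int = 3) -> List[Passage]:
        q_tokens = set(question.lower().split())
        scored = [
            Passage(text=doc, score=len(q_tokens & set(doc.lower().split())))
            for doc in self.corpus
        ]
        return sorted(scored, key=lambda p: p.score, reverse=True)[:k]


def rag_answer(
    question: str,
    retriever: KeywordRetriever,
    generate: Callable[[str], str],
) -> str:
    """Augment the prompt with retrieved passages, then call the generator.

    `generate` stands in for a domain fine-tuned LLM; pairing retrieval with
    such a model is, at a coarse level, the collaboration the abstract describes.
    """
    passages = retriever.retrieve(question)
    context = "\n".join(f"- {p.text}" for p in passages if p.score > 0)
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    return generate(prompt)


if __name__ == "__main__":
    corpus = [
        "HotpotQA is a multi-hop question answering benchmark.",
        "Retrieval-augmented generation augments prompts with retrieved passages.",
    ]
    retriever = KeywordRetriever(corpus)
    # Placeholder generator; a real system would call a fine-tuned LLM here.
    echo_llm = lambda prompt: f"[model output for prompt of {len(prompt)} chars]"
    print(rag_answer("What is HotpotQA?", retriever, echo_llm))
```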

Similar Work