
Kuaiji: The First Chinese Accounting Large Language Model

Luo Jiayuan, Yang Songhua, Qiu Xiaoling, Chen Panyu, Nai Yufei, Zeng Wenxuan, Zhang Wentao, Jiang Xinke. arXiv 2024

[Paper]    
Fine Tuning · GPT · Model Architecture · Pretraining Methods · Reinforcement Learning · Tools · Training Techniques

Large Language Models (LLMs) like ChatGPT and GPT-4 have demonstrated impressive proficiency in comprehending and generating natural language. However, they encounter difficulties when adapting to specialized domains such as accounting. To address this challenge, we introduce Kuaiji, a tailored accounting LLM. Kuaiji is meticulously fine-tuned on the Baichuan framework through continuous pre-training followed by supervised fine-tuning. Supported by CAtAcctQA, a dataset containing a large collection of genuine accountant-client dialogues, Kuaiji exhibits exceptional accuracy and response speed. Our contributions encompass the creation of the first Chinese accounting dataset, the establishment of Kuaiji as a leading open-source Chinese accounting LLM, and the validation of its efficacy in real-world accounting scenarios.
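
The abstract gives no implementation details, but as a rough illustration of the supervised fine-tuning stage it describes, the sketch below shows how a Baichuan-style base checkpoint could be fine-tuned on accountant-client question-answer pairs with Hugging Face Transformers. The checkpoint name, the CAtAcctQA record layout, and all hyperparameters are assumptions for illustration only, not the authors' actual pipeline.

```python
# Minimal SFT sketch with Hugging Face Transformers.
# Assumptions (not from the paper): base checkpoint name, dialogue JSON layout,
# prompt template, and all hyperparameters.
import torch
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "baichuan-inc/Baichuan2-7B-Base"  # hypothetical Baichuan checkpoint

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, torch_dtype=torch.bfloat16, trust_remote_code=True
)

# Hypothetical CAtAcctQA-style records: one client question / accountant answer per item.
records = [
    {"question": "什么是权责发生制？", "answer": "权责发生制是指以取得收款权利或付款义务为标准确认收入和费用……"},
]

def to_text(example):
    # Concatenate question and answer into a single causal-LM training string.
    return {"text": f"问: {example['question']}\n答: {example['answer']}{tokenizer.eos_token}"}

def tokenize(example):
    return tokenizer(example["text"], truncation=True, max_length=1024)

dataset = (
    Dataset.from_list(records)
    .map(to_text)
    .map(tokenize, remove_columns=["question", "answer", "text"])
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="kuaiji-sft",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=3,
        learning_rate=2e-5,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The same loop, run over raw domain text instead of question-answer pairs, would correspond to the continuous pre-training stage mentioned in the abstract.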

Similar Work