CogGPT: Unleashing the Power of Cognitive Dynamics on Large Language Models

Lv Yaojia, Pan Haojie, Fu Ruiji, Liu Ming, Wang Zhongyuan, Qin Bing. arXiv 2024

[Paper]    
Tags: GPT, Model Architecture, Reinforcement Learning, Survey Paper

Cognitive dynamics are pivotal to advancing human understanding of the world. Recent advancements in large language models (LLMs) reveal their potential for cognitive simulation. However, these LLM-based cognitive studies primarily focus on static modeling, overlooking the dynamic nature of cognition. To bridge this gap, we propose the concept of the cognitive dynamics of LLMs and present a corresponding task inspired by longitudinal studies. For this task, we develop CogBench, a novel benchmark to assess the cognitive dynamics of LLMs, and validate it through participant surveys. We also design two evaluation metrics for CogBench: Authenticity and Rationality. Recognizing the inherent static nature of LLMs, we introduce CogGPT for the task, which features an innovative iterative cognitive mechanism aimed at enhancing lifelong cognitive dynamics. Empirical results demonstrate the superiority of CogGPT over existing methods, particularly in its ability to facilitate role-specific cognitive dynamics under continuous information flows.
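The abstract does not detail how the iterative cognitive mechanism works. As a rough illustration only, the Python sketch below shows one way a role-specific cognitive state might be updated incrementally under a continuous information flow, rather than being recomputed statically. All names (`CognitiveProfile`, `update_profile`), the stance representation, and the blending rule are hypothetical assumptions for illustration, not the paper's actual method.

```python
# Hypothetical sketch of an iterative, role-specific cognitive update loop.
# The data structures and update rule are illustrative assumptions, not CogGPT's mechanism.

from dataclasses import dataclass, field


@dataclass
class CognitiveProfile:
    """A role-specific cognitive state that evolves across iterations."""
    role: str
    beliefs: dict[str, float] = field(default_factory=dict)  # topic -> stance in [-1, 1]


def update_profile(profile: CognitiveProfile, topic: str, signal: float,
                   learning_rate: float = 0.3) -> None:
    """Blend new evidence into the prior stance instead of overwriting it,
    so the profile drifts gradually over time (a 'lifelong' dynamic)."""
    prior = profile.beliefs.get(topic, 0.0)
    profile.beliefs[topic] = (1 - learning_rate) * prior + learning_rate * signal


# Simulated continuous information flow: (topic, sentiment signal) pairs.
stream = [("remote work", 0.8), ("remote work", -0.2), ("remote work", 0.5)]

profile = CognitiveProfile(role="software engineer")
for topic, signal in stream:
    update_profile(profile, topic, signal)
    print(f"stance on {topic!r}: {profile.beliefs[topic]:+.2f}")
```

A benchmark like CogBench could then compare such evolving states against human longitudinal survey responses, which is presumably what the Authenticity and Rationality metrics quantify; the exact definitions are not given here.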
