Prompted LLMs as Chatbot Modules for Long Open-Domain Conversation

Gibbeum Lee, Volker Hartmann, Jongho Park, Dimitris Papailiopoulos, Kangwook Lee. arXiv 2023

[Paper]

Tags: Agentic, Few-Shot, Fine-Tuning, In-Context Learning, Pretraining Methods, Prompting, Training Techniques

In this paper, we propose MPC (Modular Prompted Chatbot), a new approach for creating high-quality conversational agents without the need for fine-tuning. Our method utilizes pre-trained large language models (LLMs) as individual modules for long-term consistency and flexibility, by using techniques such as few-shot prompting, chain-of-thought (CoT), and external memory. Our human evaluation results show that MPC is on par with fine-tuned chatbot models in open-domain conversations, making it an effective solution for creating consistent and engaging chatbots.
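The modular design described above can be illustrated with a minimal sketch: separately prompted calls to a frozen LLM act as modules (here a summarizer and an utterance generator), with an external memory of dialogue summaries supporting long-term consistency. The module names, the naive recency-based retrieval, and the `llm` stub are all illustrative assumptions, not the paper's implementation; a real system would call a pre-trained LLM with few-shot prompts and use a proper retriever over memory.

```python
# Hypothetical sketch of a modular prompted chatbot in the spirit of MPC.
# The llm() function is a stub so the sketch runs without API access;
# in practice each module would be a few-shot prompt to a frozen LLM.

def llm(prompt: str) -> str:
    """Stub standing in for a call to a pre-trained LLM (no fine-tuning)."""
    return f"<llm-output for: {prompt.splitlines()[-1]}>"

class Memory:
    """External memory: stores dialogue summaries for long-term consistency."""
    def __init__(self):
        self.summaries = []

    def add(self, summary: str) -> None:
        self.summaries.append(summary)

    def retrieve(self, query: str, k: int = 2) -> list:
        # Naive relevance: return the k most recent summaries. A real
        # system would use a retriever (e.g. dense retrieval) instead.
        return self.summaries[-k:]

def summarizer(recent_turns: list) -> str:
    """Summarizer module: prompts the LLM to condense recent dialogue."""
    return llm("Summarize the dialogue:\n" + "\n".join(recent_turns))

def utterance_generator(user_msg: str, memories: list) -> str:
    """Generator module: conditions the reply on retrieved memories."""
    context = "\n".join(memories)
    return llm(f"Memory:\n{context}\nUser: {user_msg}\nBot:")

def chat_turn(user_msg: str, history: list, memory: Memory) -> str:
    """One dialogue turn: retrieve memory, generate, periodically summarize."""
    memories = memory.retrieve(user_msg)
    reply = utterance_generator(user_msg, memories)
    history.extend([f"User: {user_msg}", f"Bot: {reply}"])
    if len(history) >= 4:  # consolidate the last turns into memory
        memory.add(summarizer(history[-4:]))
    return reply
```

A usage example: calling `chat_turn` repeatedly grows the history, and once enough turns accumulate, a summary is written to memory and conditions later replies.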

Similar Work