
PERSOMA: Personalized Soft Prompt Adapter Architecture For Personalized Language Prompting

Hebert Liam, Sayana Krishna, Jash Ambarish, Karatzoglou Alexandros, Sodhi Sukhdeep, Doddapaneni Sumanth, Cai Yanli, Kuzmin Dima. arXiv 2024

[Paper]    
Tags: Fine Tuning, Model Architecture, Prompting

Understanding the nuances of a user’s extensive interaction history is key to building accurate and personalized natural language systems that can adapt to evolving user preferences. To address this, we introduce PERSOMA, a Personalized Soft Prompt Adapter architecture. Unlike previous personalized prompting methods for large language models, PERSOMA offers a novel approach to efficiently capture user history. It achieves this by resampling and compressing interactions, expressed as free-form text, into expressive soft prompt embeddings, building upon recent research that uses embedding representations as input for LLMs. We rigorously validate our approach by evaluating various adapter architectures, first-stage sampling strategies, parameter-efficient tuning techniques such as LoRA, and other personalization methods. Our results demonstrate PERSOMA’s superior ability to handle large and complex user histories compared to existing embedding-based and text-prompt-based techniques.
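To make the core idea concrete, the sketch below shows one plausible way a soft prompt adapter of this kind could be implemented: user-history items, already embedded as free-form text, are compressed by a learned cross-attention resampler into a fixed number of soft prompt vectors in the LLM's embedding space. This is a minimal illustration only; the module names, dimensions, and the cross-attention resampler design are assumptions for exposition, not PERSOMA's actual architecture as described in the paper.

```python
# Illustrative sketch (not PERSOMA's published architecture): compress a
# variable-length set of embedded user-history items into a fixed number of
# soft prompt vectors via learned-query cross-attention, then project them
# into the LLM's embedding space so they can be prepended to the task prompt.
import torch
import torch.nn as nn


class SoftPromptResampler(nn.Module):
    """Maps (batch, num_items, history_dim) history embeddings to
    (batch, num_soft_prompts, llm_dim) soft prompt embeddings."""

    def __init__(self, history_dim: int, llm_dim: int,
                 num_soft_prompts: int = 8, num_heads: int = 4):
        super().__init__()
        # Learned query vectors, one per output soft prompt token (assumed design).
        self.queries = nn.Parameter(torch.randn(num_soft_prompts, history_dim) * 0.02)
        self.attn = nn.MultiheadAttention(history_dim, num_heads, batch_first=True)
        # Project the compressed representations into the LLM embedding space.
        self.proj = nn.Linear(history_dim, llm_dim)

    def forward(self, history_emb: torch.Tensor) -> torch.Tensor:
        # history_emb: sentence embeddings of free-form text describing past
        # user interactions, shape (batch, num_items, history_dim).
        batch = history_emb.size(0)
        q = self.queries.unsqueeze(0).expand(batch, -1, -1)
        compressed, _ = self.attn(q, history_emb, history_emb)
        return self.proj(compressed)


if __name__ == "__main__":
    # Toy usage: 3 users, 50 history items embedded in 384 dims, compressed
    # into 8 soft prompt tokens in a 2048-dim LLM embedding space.
    resampler = SoftPromptResampler(history_dim=384, llm_dim=2048)
    history = torch.randn(3, 50, 384)
    soft_prompts = resampler(history)
    print(soft_prompts.shape)  # torch.Size([3, 8, 2048])
    # In a full pipeline, these soft prompts would be concatenated in front of
    # the token embeddings of the text prompt before being fed to the LLM.
```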

Similar Work