
Prompt Optimizer Of Text-to-image Diffusion Models For Abstract Concept Understanding

Fan Zezhong, Li Xiaohan, Fang Chenhao, Biswas Topojoy, Nag Kaushiki, Xu Jianpeng, Achan Kannan. arXiv 2024

[Paper]    
Agentic, Efficiency And Optimization, GPT, Merging, Model Architecture, Prompting, Reinforcement Learning, Tools

The rapid evolution of text-to-image diffusion models has opened the door to generative AI, enabling the translation of textual descriptions into visually compelling images of remarkable quality. A persistent challenge in this domain, however, is optimizing prompts so that abstract concepts are effectively conveyed through concrete objects. For example, text encoders can hardly express “peace”, yet can easily illustrate olive branches and white doves. This paper introduces a novel approach named Prompt Optimizer for Abstract Concepts (POAC), specifically designed to enhance the performance of text-to-image diffusion models in interpreting and generating images from abstract concepts. We propose a Prompt Language Model (PLM), which is initialized from a pre-trained language model and then fine-tuned on a curated dataset of abstract-concept prompts. The dataset is created with GPT-4, which expands each abstract concept into a scene with concrete objects. Our framework employs a Reinforcement Learning (RL)-based optimization strategy focused on the alignment between the images generated by a Stable Diffusion model and the optimized prompts. Through extensive experiments, we demonstrate that POAC significantly improves the accuracy and aesthetic quality of generated images, particularly in the depiction of abstract concepts and in alignment with the optimized prompts. We also present a comprehensive analysis of the model's performance across diffusion models under different settings, showcasing its versatility and effectiveness in enhancing abstract concept representation.
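The abstract describes a pipeline in which a language model rewrites an abstract concept into a concrete scene description and an alignment signal between the rendered image and the concept drives optimization. The sketch below is a rough illustration of that idea, not the authors' implementation: it replaces the paper's RL fine-tuning with simple best-of-N selection, using an off-the-shelf GPT-2 as a stand-in for the fine-tuned PLM, a public Stable Diffusion checkpoint as the image generator, and a CLIP image-text score as the alignment reward. All checkpoints, function names, and hyperparameters here are illustrative assumptions.

```python
# Hedged sketch: best-of-N prompt selection as a simplified stand-in for
# POAC's RL-based optimization. Model checkpoints are common public ones,
# chosen for illustration only.
import torch
from transformers import pipeline, CLIPModel, CLIPProcessor
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"

# Candidate prompt generator (stand-in for the paper's fine-tuned PLM).
generator = pipeline("text-generation", model="gpt2",
                     device=0 if device == "cuda" else -1)

# Image generator and CLIP-based alignment scorer.
sd = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)
clip = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").to(device)
clip_proc = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def clip_score(image, text):
    """CLIP image-text alignment, used here as a proxy for the RL reward."""
    inputs = clip_proc(text=[text], images=image,
                       return_tensors="pt", padding=True).to(device)
    with torch.no_grad():
        out = clip(**inputs)
    return out.logits_per_image.item()

def optimize_prompt(concept, n_candidates=4):
    """Draft several concrete rewrites of an abstract concept and keep
    the one whose rendered image aligns best with the concept."""
    seed = f'Describe a concrete scene that depicts "{concept}":'
    candidates = [
        g["generated_text"][len(seed):].strip()
        for g in generator(seed, max_new_tokens=40,
                           num_return_sequences=n_candidates, do_sample=True)
    ]
    scored = []
    for prompt in candidates:
        image = sd(prompt, num_inference_steps=25).images[0]
        scored.append((clip_score(image, concept), prompt))
    return max(scored)[1]

# e.g. "peace" may surface scenes with olive branches or white doves.
print(optimize_prompt("peace"))
```

Scoring the image against the original abstract concept, rather than against the rewritten prompt, mirrors the paper's goal of keeping the concrete scene faithful to the abstract idea; an RL approach would additionally use this score to update the prompt model's weights rather than merely select among samples.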
