Word2world: Generating Stories And Worlds Through Large Language Models

Nasir Muhammad U., James Steven, Togelius Julian. arXiv 2024

Tags: Fine-Tuning, Has Code, Pretraining Methods, RAG, Reinforcement Learning, Training Techniques

Large Language Models (LLMs) have proven their worth across a diverse spectrum of disciplines. LLMs have also shown great potential in Procedural Content Generation (PCG), but directly generating a level with a pre-trained LLM remains challenging. This work introduces Word2World, a system that enables LLMs to procedurally design playable games from stories, without any task-specific fine-tuning. Word2World leverages the abilities of LLMs to create diverse content and extract information. Combining these abilities, LLMs can create a story for the game, design a narrative, and place tiles in appropriate locations to create coherent worlds and playable games. We test Word2World with different LLMs and perform a thorough ablation study to validate each step. We open-source the code at https://github.com/umair-nasir14/Word2World.
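The abstract describes a pipeline of LLM calls: generate a story, extract entities from it, then place tiles to form a world grid. A minimal sketch of that flow is below; it is not the authors' implementation, and `prompt_llm` is a hypothetical stub standing in for real LLM API calls so the example runs offline.

```python
# Hypothetical sketch of a Word2World-style story-to-world pipeline.
# The real system prompts an LLM at each step; here prompt_llm is a
# canned stub so the control flow is runnable without API access.

def prompt_llm(prompt: str) -> str:
    """Stand-in for an LLM call (e.g. a chat-completion request)."""
    canned = {
        "story": "A knight crosses a river to reach a castle guarded by a goblin.",
        "extract": "protagonist:knight, antagonist:goblin, goal:castle, hazard:river",
    }
    return canned["story" if "story" in prompt.lower() else "extract"]

def extract_entities(story: str) -> dict:
    """Ask the (stub) LLM to pull out characters and key locations."""
    raw = prompt_llm(f"Extract entities from: {story}")
    return dict(pair.split(":") for pair in raw.split(", "))

def place_tiles(entities: dict, width: int = 8, height: int = 5) -> list:
    """Place one tile per entity on an empty grid; a real system would
    also prompt the LLM to check coherence and playability."""
    grid = [["." for _ in range(width)] for _ in range(height)]
    spots = [(1, 1), (height - 2, width - 2), (height // 2, width // 2), (0, width // 2)]
    for (r, c), key in zip(spots, entities):
        grid[r][c] = key[0].upper()  # first letter of the entity as a tile symbol
    return grid

story = prompt_llm("Write a short story for a game.")
world = place_tiles(extract_entities(story))
print("\n".join("".join(row) for row in world))
```

The tile positions here are fixed for illustration; in the paper's pipeline, placement and validation are themselves driven by LLM prompts.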
