
Language Model Sentence Completion With A Parser-driven Rhetorical Control Method

Joshua Zingale, Jugal Kalita. arXiv 2024

[Paper]    
Applications, Fine Tuning, Language Modeling, Pretraining Methods, Training Techniques

Controlled text generation (CTG) seeks to guide large language model (LLM) output so that it conforms to desired criteria. This study presents a novel CTG algorithm that enforces adherence to specific rhetorical relations in an LLM sentence-completion setting via a parser-driven decoding scheme that requires no model fine-tuning. The method is validated with both automatic and human evaluation. The code is available on GitHub.
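The abstract describes an inference-time, parser-driven decoding scheme rather than fine-tuning. Below is a minimal, hypothetical sketch of one way such rhetorical control could be realized as candidate reranking: LM likelihood is combined with a parser-derived score for the desired rhetorical relation. The names `rerank_completions`, `score_relation`, and `target_weight` are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of parser-guided reranking for rhetorical control.
# All names here are hypothetical placeholders, not the paper's code.
import math
from typing import Callable, List, Tuple


def rerank_completions(
    prefix: str,
    candidates: List[Tuple[str, float]],          # (completion, LM log-probability)
    score_relation: Callable[[str, str], float],  # parser-based relation score in (0, 1]
    target_weight: float = 1.0,
) -> List[Tuple[str, float]]:
    """Combine LM likelihood with a parser-derived rhetorical-relation score.

    score_relation(prefix, completion) is assumed to return how strongly a
    discourse parser judges the completion to realize the desired rhetorical
    relation (e.g., CONTRAST or ELABORATION) with respect to the prefix.
    """
    reranked = []
    for completion, lm_logprob in candidates:
        relation = score_relation(prefix, completion)
        # Interpolate fluency (LM log-prob) with the weighted log relation score.
        combined = lm_logprob + target_weight * math.log(max(relation, 1e-9))
        reranked.append((completion, combined))
    return sorted(reranked, key=lambda x: x[1], reverse=True)
```

In this sketch, candidate completions sampled from any off-the-shelf LLM are rescored with a discourse parser at decoding time, so no model weights are updated, which is consistent with the no-fine-tuning property stated in the abstract.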

Similar Work