BOLT: Fast Energy-based Controlled Text Generation With Tunable Biases

Xin Liu, Muhammad Khalifa, Lu Wang. arXiv 2023

[Paper]    
Applications Efficiency And Optimization Ethics And Bias GPT Language Modeling Pretraining Methods Reinforcement Learning

Energy-based models (EBMs) have gained popularity for controlled text generation because they accommodate a wide range of constraints. However, sampling from EBMs is non-trivial: it often requires many iterations to converge to plausible text, which slows decoding and limits practicality in real-world applications. In this work, we propose BOLT, which relies on tunable biases to directly adjust the language model's output logits. Unlike prior work, BOLT preserves the generator's autoregressive nature, asserting strong control over token-wise conditional dependencies and overall fluency, and thus converges faster. Compared with state-of-the-art methods on controlled generation tasks with both soft constraints (e.g., sentiment control) and hard constraints (e.g., keyword-guided topic control), BOLT achieves markedly better efficiency and fluency. On sentiment control, BOLT is 7x faster than competitive baselines and, according to human judges, more fluent in 74.4% of evaluation samples.
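The core mechanism is easy to picture: one trainable bias vector per decoding step is added to the frozen language model's logits, and the biases are optimized by gradient descent against an energy that combines the constraint with a fluency term. The sketch below illustrates this idea only; it is not the authors' implementation. It assumes GPT-2 as the frozen base LM, a toy single-token "constraint" energy, and a KL term standing in for the paper's fluency objective.

```python
# A minimal sketch of step-wise logit-bias tuning in the spirit of BOLT.
# Assumptions (not from the paper): GPT-2 base LM, a toy target-token
# constraint, a KL fluency surrogate, and soft-embedding feedback to keep
# the autoregressive chain differentiable w.r.t. the biases.
import torch
import torch.nn.functional as F
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

device = "cuda" if torch.cuda.is_available() else "cpu"
tok = GPT2TokenizerFast.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2").to(device).eval()
for p in lm.parameters():
    p.requires_grad_(False)

emb = lm.get_input_embeddings().weight                  # (V, d), frozen
prompt_ids = tok("The movie was", return_tensors="pt").input_ids.to(device)

steps, V = 12, emb.size(0)
biases = torch.zeros(steps, V, device=device, requires_grad=True)  # tunable
opt = torch.optim.Adam([biases], lr=0.05)
target = tok.encode(" wonderful")[0]                    # toy constraint token

for it in range(30):                                    # bias-tuning iterations
    opt.zero_grad()
    x = emb[prompt_ids[0]].unsqueeze(0)                 # (1, L, d) prompt embeds
    energy = 0.0
    for t in range(steps):
        logits = lm(inputs_embeds=x).logits[:, -1, :]   # next-token logits
        biased = logits + biases[t]                     # additive tunable bias
        probs = F.softmax(biased, dim=-1)
        # Toy energy: push probability mass toward the target token, while a
        # KL term keeps the biased distribution close to the LM's own.
        energy = energy - torch.log(probs[0, target] + 1e-9)
        energy = energy + F.kl_div(F.log_softmax(biased, dim=-1),
                                   F.softmax(logits, dim=-1).detach(),
                                   reduction="batchmean")
        # Feed the expected (soft) embedding back in: decoding stays
        # autoregressive and differentiable w.r.t. the biases.
        x = torch.cat([x, (probs @ emb).unsqueeze(1)], dim=1)
    energy.backward()
    opt.step()

# Greedy decoding with the tuned biases applied at each step.
with torch.no_grad():
    ids = prompt_ids
    for t in range(steps):
        logits = lm(ids).logits[:, -1, :] + biases[t]
        ids = torch.cat([ids, logits.argmax(-1, keepdim=True)], dim=1)
print(tok.decode(ids[0]))
```

The soft-embedding feedback is one simple way to keep gradients flowing through the autoregressive chain; BOLT's exact gradient path, energy terms, and bias parameterization differ, and the paper handles richer constraints such as keyword-guided topic control.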
