Syntax-driven Iterative Expansion Language Models For Controllable Text Generation

Noe Casas, José A. R. Fonollosa, Marta R. Costa-jussà. arXiv 2020

[Paper]    
Applications · Ethics And Bias · Language Modeling · Model Architecture · Pretraining Methods · Transformer

The dominant language modeling paradigm handles text as a sequence of discrete tokens. While that approach can capture the latent structure of the text, it is inherently constrained to sequential dynamics for text generation. We propose a new paradigm for introducing a syntactic inductive bias into neural text generation, where the dependency parse tree is used to drive the Transformer model to generate sentences iteratively. Our experiments show that this paradigm is effective at text generation, with quality between that of LSTMs and Transformers and comparable diversity, while requiring fewer than half of their decoding steps. Moreover, its generation process allows direct control over the syntactic constructions of the generated text, enabling the induction of stylistic variations.
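
The core idea, expanding a dependency tree level by level with one model call per level, can be illustrated with a short sketch. Everything below is hypothetical scaffolding: `Node`, `predict_children`, and the toy lookup table stand in for the paper's learned Transformer expansion model, and the sketch only shows why the number of decoding steps scales with tree depth rather than sentence length.

```python
# Hypothetical sketch of syntax-driven iterative expansion decoding.
# A real system would replace `predict_children` with a Transformer that,
# given the partial tree, predicts the dependency children of every
# frontier token in parallel; a toy lookup table keeps this runnable.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Node:
    token: str
    left: List["Node"] = field(default_factory=list)   # left dependents
    right: List["Node"] = field(default_factory=list)  # right dependents

# Toy stand-in for the learned model: head token -> (left, right) children.
TOY_MODEL: Dict[str, Tuple[List[str], List[str]]] = {
    "eats": (["cat"], ["fish"]),
    "cat": (["the"], []),
}

def predict_children(head: Node) -> Tuple[List[str], List[str]]:
    return TOY_MODEL.get(head.token, ([], []))

def generate(root_token: str, max_depth: int = 8) -> Node:
    """Each loop iteration expands one whole tree level, so decoding
    takes O(tree depth) steps instead of O(sentence length)."""
    root = Node(root_token)
    frontier = [root]
    for _ in range(max_depth):
        next_frontier: List[Node] = []
        for head in frontier:
            lefts, rights = predict_children(head)
            head.left = [Node(t) for t in lefts]
            head.right = [Node(t) for t in rights]
            next_frontier += head.left + head.right
        if not next_frontier:  # no node proposed further children
            break
        frontier = next_frontier
    return root

def linearize(n: Node) -> str:
    """In-order traversal of the dependency tree yields the sentence."""
    parts = [linearize(c) for c in n.left] + [n.token]
    parts += [linearize(c) for c in n.right]
    return " ".join(parts)

print(linearize(generate("eats")))  # -> "the cat eats fish"
```

In this scheme, syntactic control reduces to constraining which expansions are allowed at each step, which is one way to read the abstract's claim that the generation process enables direct control over syntactic constructions and the induction of stylistic variations.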

Similar Work