PAIR: Planning And Iterative Refinement In Pre-trained Transformers For Long Text Generation

Hua Xinyu, Wang Lu. arXiv 2020

[Paper]    
Tags: Applications, BERT, Language Modeling, Model Architecture, Pretraining Methods, RAG, Tools, Transformer

Pre-trained Transformers have enabled impressive breakthroughs in generating long and fluent text, yet their outputs are often “rambling,” without coherently arranged content. In this work, we present PAIR, a content-controlled text generation framework with planning and iterative refinement, built upon the large pre-trained model BART. We first adapt BERT to automatically construct content plans, consisting of keyphrase assignments and their corresponding sentence-level positions. BART is then employed for generation without modifying its architecture. We further propose a refinement algorithm that gradually improves generation quality within the sequence-to-sequence framework. Evaluation with automatic metrics shows that adding planning consistently improves generation quality across three distinct domains, with average gains of 20 BLEU points and 12 METEOR points. In addition, human judges rate our system's outputs as more relevant and coherent than those of comparison systems without planning.
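The pipeline can be pictured as: build a keyphrase plan, generate a draft with BART conditioned on that plan, then repeatedly mask low-confidence tokens in the draft and regenerate. The sketch below is a minimal illustrative approximation built on Hugging Face's BART, not the authors' released code; the encoder input layout in `build_input`, the `mask_ratio` heuristic, and all helper names are assumptions for demonstration.

```python
# Illustrative plan -> generate -> refine loop in the spirit of PAIR.
# Input layout, masking heuristic, and helper names are assumptions.
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
model.eval()

def build_input(prompt, keyphrases, draft=""):
    # Encoder input: prompt, planned keyphrases, and the masked draft (if any).
    return f"{prompt} </s> {' '.join(keyphrases)} </s> {draft}".strip()

@torch.no_grad()
def generate_with_confidence(text):
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    out = model.generate(**enc, max_length=128, num_beams=1, do_sample=False,
                         output_scores=True, return_dict_in_generate=True)
    seq = out.sequences[0]
    # Probability assigned to each generated token under greedy decoding.
    probs = [torch.softmax(s[0], dim=-1)[seq[i + 1]].item()
             for i, s in enumerate(out.scores)]
    return seq[1:1 + len(probs)].tolist(), probs

def generate_and_refine(prompt, keyphrases, n_iters=3, mask_ratio=0.3):
    draft = ""
    for _ in range(n_iters):
        gen_ids, probs = generate_with_confidence(
            build_input(prompt, keyphrases, draft))
        # Mask the lowest-confidence tokens and feed the template back in.
        k = max(1, int(mask_ratio * len(probs)))
        low = set(sorted(range(len(probs)), key=probs.__getitem__)[:k])
        pieces = tokenizer.convert_ids_to_tokens(gen_ids)
        pieces = [tokenizer.mask_token if i in low else t
                  for i, t in enumerate(pieces)]
        draft = tokenizer.convert_tokens_to_string(pieces)
    return tokenizer.decode(gen_ids, skip_special_tokens=True)

print(generate_and_refine("Why is regular exercise important?",
                          ["cardiovascular health", "mood", "sleep quality"]))
```

Note that this sketch uses an off-the-shelf BART checkpoint; the paper fine-tunes the model on plan-conditioned inputs, so outputs here only demonstrate the control flow, not the reported quality.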

Similar Work