Explicit Syntactic Guidance For Neural Text Generation

Li Yafu, Cui Leyang, Yan Jianhao, Yin Yongjing, Bi Wei, Shi Shuming, Zhang Yue. arXiv 2023

[Paper]    
Applications, GPT, Interpretability And Explainability, Language Modeling, Pretraining Methods

Most existing text generation models follow the sequence-to-sequence paradigm. Generative Grammar suggests that humans generate natural language texts by learning language grammar. We propose a syntax-guided generation schema, which generates the sequence guided by a constituency parse tree in a top-down direction. The decoding process can be decomposed into two parts: (1) predicting the infilling texts for each constituent in the lexicalized syntax context given the source sentence; (2) mapping and expanding each constituent to construct the next-level syntax context. Accordingly, we propose a structural beam search method to find possible syntax structures hierarchically. Experiments on paraphrase generation and machine translation show that the proposed method outperforms autoregressive baselines, while also demonstrating effectiveness in terms of interpretability, controllability, and diversity.
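The sketch below illustrates the decoding loop described above: top-down expansion of a constituency structure with a structural beam search over candidate syntax contexts. It is a minimal, illustrative Python sketch; the interfaces `predict_infills` and `expand_constituent` are hypothetical stand-ins for the model components and are not the authors' released implementation.

```python
# Minimal sketch of top-down, constituency-guided decoding with structural beam
# search. `predict_infills` and `expand_constituent` are hypothetical stand-ins
# for the model components; they are NOT the paper's released implementation.
import copy
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Constituent:
    label: str                       # syntactic label, e.g. "S", "NP", "VP"
    text: str = ""                   # infilling text predicted for this constituent
    children: List["Constituent"] = field(default_factory=list)


def predict_infills(source: str, frontier: List[Constituent], k: int) -> List[Tuple[List[str], float]]:
    # Toy stand-in: return one candidate assignment (one text span per frontier
    # constituent) with a log-probability score. A real model would condition on
    # the source sentence and the lexicalized syntax context.
    return [([f"<{c.label}>" for c in frontier], -1.0)]


def expand_constituent(c: Constituent) -> List[Constituent]:
    # Toy stand-in: treat every constituent as a leaf. A real model/grammar step
    # would map the filled constituent to its next-level child constituents.
    c.children = []
    return c.children


def structural_beam_search(source: str, beam_size: int = 4, max_depth: int = 8) -> Constituent:
    # Each beam item: (partial tree, frontier constituents awaiting expansion, cumulative score).
    root = Constituent(label="S")
    beams: List[Tuple[Constituent, List[Constituent], float]] = [(root, [root], 0.0)]

    for _ in range(max_depth):
        candidates: List[Tuple[Constituent, List[Constituent], float]] = []
        for tree, frontier, score in beams:
            if not frontier:         # this hypothesis is fully expanded
                candidates.append((tree, frontier, score))
                continue
            for texts, logp in predict_infills(source, frontier, beam_size):
                # Copy tree and frontier together so the frontier still points into the copy.
                tree_c, frontier_c = copy.deepcopy((tree, frontier))
                # (1) predict the infilling text for each constituent at this level
                for c, t in zip(frontier_c, texts):
                    c.text = t
                # (2) map and expand each constituent to build the next-level syntax context
                next_frontier = [child for c in frontier_c for child in expand_constituent(c)]
                candidates.append((tree_c, next_frontier, score + logp))
        # keep only the top-scoring syntax structures at this level
        beams = sorted(candidates, key=lambda x: x[2], reverse=True)[:beam_size]
        if all(not frontier for _, frontier, _ in beams):
            break

    return beams[0][0]


if __name__ == "__main__":
    print(structural_beam_search("the cat sat on the mat"))
```

In the paper's actual method, the infilling predictions and constituent expansions come from a trained sequence-to-sequence model conditioned on the source sentence, and the beam ranks whole syntax structures level by level; the toy stand-ins above only mark the places where those model calls would go.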
