
Big Bidirectional Insertion Representations For Documents

Lala Li, William Chan. arXiv 2019

[Paper]    
Tags: Applications, Attention Mechanism, Language Modeling, Model Architecture, Pretraining Methods, Transformer

The Insertion Transformer is well suited to long-form text generation due to its parallel generation capability, requiring only \(O(\log_2 n)\) generation steps to produce \(n\) tokens. However, modeling long sequences is difficult, as the attention mechanism must resolve greater ambiguity over longer contexts. This work proposes Big Bidirectional Insertion Representations for Documents (Big BIRD), an insertion-based model for document-level translation tasks, scaling insertion-based models up to long-form documents. Our key contribution is introducing sentence alignment via sentence-positional embeddings between the source and target documents. We show an improvement of +4.3 BLEU on the WMT'19 English\(\rightarrow\)German document-level translation task compared with the Insertion Transformer baseline.
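As a concrete illustration of the \(O(\log_2 n)\) claim, the short sketch below (not from the paper; names are illustrative only) simulates the balanced binary insertion order used by the Insertion Transformer: at each parallel step, every open span inserts its centermost token and then splits into two sub-spans, so the number of steps grows only logarithmically in the sequence length.

```python
# Minimal sketch (not from the paper): simulate the balanced binary insertion
# order of the Insertion Transformer to show that generating n tokens takes
# only ceil(log2(n + 1)) parallel insertion steps.

import math


def parallel_insertion_steps(n):
    """Count the parallel steps needed to generate n tokens with balanced
    binary insertion: each open span inserts its centermost token, then
    splits into a left and a right sub-span handled in the next step."""
    pending = [(0, n)] if n > 0 else []  # half-open spans still to be filled
    steps = 0
    while pending:
        next_pending = []
        for lo, hi in pending:
            mid = (lo + hi) // 2                 # insert the middle position of the span
            next_pending.append((lo, mid))       # left remainder
            next_pending.append((mid + 1, hi))   # right remainder
        pending = [(lo, hi) for lo, hi in next_pending if lo < hi]
        steps += 1
    return steps


if __name__ == "__main__":
    for n in (1, 7, 100, 1000, 100_000):
        print(f"n={n:6d}  steps={parallel_insertion_steps(n):2d}  "
              f"ceil(log2(n+1))={math.ceil(math.log2(n + 1)):2d}")
```

Compared with the \(n\) sequential steps of a left-to-right autoregressive decoder, this logarithmic step count is what makes insertion-based models attractive for document-length outputs.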

Similar Work