
Modern Methods For Text Generation

Dimas Munoz Montesinos. arXiv 2020

[Paper]    
Applications · BERT · GPT · Language Modeling · Model Architecture · Pretraining Methods · Transformer

Synthetic text generation is challenging and has seen limited success. Recently, a new architecture called the Transformer has allowed machine learning models to better handle sequential data in tasks such as translation and summarization. BERT and GPT-2, which use Transformers at their core, have shown strong performance on tasks such as text classification, translation, and NLI. In this article, we analyse both algorithms and compare their output quality in text generation tasks.
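For context, a minimal sketch of the kind of text generation the paper evaluates, using GPT-2 via the Hugging Face `transformers` library; the model name `gpt2`, the prompt, and the sampling parameters here are illustrative assumptions, not the paper's exact setup:

```python
# Minimal sketch (assumed setup, not the paper's): generate text with GPT-2
# using the Hugging Face transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

outputs = generator(
    "Transformers allow machine learning models to",  # illustrative prompt
    max_length=50,           # cap on total tokens (prompt + continuation)
    num_return_sequences=2,  # produce two candidate continuations
    do_sample=True,          # sample instead of greedy decoding
    top_k=50,                # restrict sampling to the 50 most likely tokens
)

for out in outputs:
    print(out["generated_text"])
```

BERT, by contrast, is trained with a masked-language-modeling objective rather than left-to-right prediction, which is one reason the paper compares the two architectures' output quality on generation tasks.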

Similar Work