A Family Of Pretrained Transformer Language Models For Russian

Zmitrovich Dmitry, Abramov Alexander, Kalmykov Andrey, Tikhonova Maria, Taktasheva Ekaterina, Astafurov Danil, Baushenko Mark, Snegirev Artem, Kadulin Vitalii, Markov Sergey, Shavrina Tatiana, Mikhailov Vladislav, Fenogenova Alena. https://aclanthology.org/, 2023

[Paper]    
Applications, Attention Mechanism, BERT, GPT, Model Architecture, Pretraining Methods, Training Techniques, Transformer

Transformer language models (LMs) are fundamental to NLP research methodologies and applications in many languages. However, developing such models specifically for the Russian language has received little attention. This paper introduces a collection of 13 Russian Transformer LMs spanning encoder (ruBERT, ruRoBERTa, ruELECTRA), decoder (ruGPT-3), and encoder-decoder (ruT5, FRED-T5) architectures. We report on the model architecture design and pretraining, and evaluate the models' generalization abilities on Russian language understanding and generation datasets and benchmarks. By pretraining and releasing these specialized Transformer LMs, we aim to broaden the scope of NLP research directions and enable the development of industrial solutions for the Russian language.
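Since the released models are standard Transformer checkpoints, they can be used through common model-hub tooling. Below is a minimal sketch of loading an encoder model and filling a masked token with Hugging Face `transformers`; the hub identifier `ai-forever/ruBert-base` and the use of `AutoModelForMaskedLM` are assumptions for illustration, not details stated in the abstract.

```python
# Minimal sketch: load a Russian encoder checkpoint and fill a masked token.
# The repository name below is an assumed hub identifier, not taken from the paper.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "ai-forever/ruBert-base"  # assumed checkpoint name for ruBERT
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Russian input with one masked position: "The capital of Russia is [MASK]."
text = "Столица России - [MASK]."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Find the mask position and decode the highest-scoring prediction for it.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = outputs.logits[0, mask_index].argmax(-1)
print(tokenizer.decode(predicted_id))
```

The decoder (ruGPT-3) and encoder-decoder (ruT5, FRED-T5) models would be loaded analogously, e.g. with `AutoModelForCausalLM` or `AutoModelForSeq2SeqLM`, under their respective checkpoint names.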

Similar Work