
IT5: Text-to-text Pretraining For Italian Language Understanding And Generation

Gabriele Sarti, Malvina Nissim. Proceedings of LREC-COLING 2024

[Paper]
Tags: Applications · Model Architecture · Pretraining Methods · Training Techniques · Transformer

We introduce IT5, the first family of encoder-decoder transformer models pretrained specifically on Italian. We document and perform a thorough cleaning procedure for a large Italian corpus and use it to pretrain four IT5 model sizes. We then introduce the ItaGen benchmark, which includes a broad range of natural language understanding and generation tasks for Italian, and use it to evaluate the performance of IT5 models and multilingual baselines. We find monolingual IT5 models to provide the best scale-to-performance ratio across tested models, consistently outperforming their multilingual counterparts and setting a new state-of-the-art for Italian language generation.
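Since the abstract describes standard encoder-decoder (T5-style) checkpoints, a minimal sketch of loading one with the Hugging Face `transformers` library follows. The Hub id `gsarti/it5-base` is an assumption not stated in this entry, and the raw pretrained model performs T5-style span infilling with sentinel tokens rather than any downstream task; fine-tuning on a benchmark task would be needed for summarization, question answering, and similar applications.

```python
# Minimal sketch: loading a pretrained IT5 checkpoint with Hugging Face
# transformers. The Hub id "gsarti/it5-base" is an assumption not stated
# in this entry; other model sizes would use analogous ids.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "gsarti/it5-base"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The raw pretrained model was trained with span corruption, so it fills
# in sentinel tokens rather than solving a downstream task directly.
text = "La capitale d'Italia è <extra_id_0>, una città ricca di storia."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```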

Similar Work