Attentive Fine-tuning of Transformers for Translation of Low-resourced Languages @LoResMT 2021

Karthik Puranik, Adeep Hande, Ruba Priyadharshini, Thenmozhi Durairaj, Anbukkarasi Sampath, Kingston Pal Thamburaj, Bharathi Raja Chakravarthi. arXiv 2021

Tags: Applications, Fine Tuning, Model Architecture, Pretraining Methods, Training Techniques, Transformer

This paper reports the Machine Translation (MT) systems submitted by the IIITT team for the English->Marathi and English->Irish language pairs of the LoResMT 2021 shared task. The task focuses on obtaining high-quality translations for low-resourced languages such as Irish and Marathi. For English->Marathi, we fine-tune IndicTrans, a pretrained multilingual NMT model, using an external parallel corpus for additional training. For English->Irish, we use a pretrained Helsinki-NLP Opus MT model. Our approaches yield promising results on the BLEU metric. Under the team name IIITT, our systems ranked 1, 1, and 2 in English->Marathi, Irish->English, and English->Irish, respectively.
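As an illustration of the two approaches named in the abstract, the sketch below loads a pretrained Opus MT English->Irish checkpoint, runs one off-the-shelf translation, and then takes a single gradient step of continued training on a parallel sentence pair. The checkpoint name `Helsinki-NLP/opus-mt-en-ga`, the example sentences, and the use of the Hugging Face `transformers` API are illustrative assumptions, not the authors' code; in particular, the paper's IndicTrans fine-tuning for English->Marathi used a different, fairseq-based model, for which the final step here is only a stand-in.

```python
# Minimal sketch, assuming the Hugging Face transformers API and the
# Helsinki-NLP/opus-mt-en-ga checkpoint (English -> Irish). Not the
# authors' implementation.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-ga"  # assumed checkpoint name
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Off-the-shelf inference, as in the English->Irish submission.
batch = tokenizer(["The weather is fine today."], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))

# One fine-tuning step on an external parallel pair: an illustrative
# stand-in for the paper's IndicTrans fine-tuning (English->Marathi),
# which was done with a different toolkit. Sentences are hypothetical.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
inputs = tokenizer(["Good morning"], text_target=["Maidin mhaith"],
                   return_tensors="pt", padding=True)
loss = model(**inputs).loss  # cross-entropy over target tokens
loss.backward()
optimizer.step()
```

In practice, fine-tuning on an external parallel corpus repeats the last step over many batches; the single step above only shows the mechanics of continued training from a pretrained checkpoint.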

Similar Work