
Silo NLP's Participation At WAT2022

Parida Shantipriya, Panda Subhadarshi, Grönroos Stig-Arne, Granroth-Wilding Mark, Koistinen Mika. arXiv 2022

[Paper]    
Model Architecture, Multimodal Models, Pretraining Methods, Transformer

This paper provides the system description of “Silo NLP’s” submission to the Workshop on Asian Translation (WAT2022). We have participated in the Indic Multimodal tasks (English->Hindi, English->Malayalam, and English->Bengali Multimodal Translation). For text-only translation, we trained Transformers from scratch and fine-tuned mBART-50 models. For multimodal translation, we used the same mBART architecture and extracted object tags from the images to use as visual features concatenated with the text sequence. Our submission tops many tasks including English->Hindi multimodal translation (evaluation test), English->Malayalam text-only and multimodal translation (evaluation test), English->Bengali multimodal translation (challenge test), and English->Bengali text-only translation (evaluation test).
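The multimodal setup described above feeds object tags extracted from the image into the same text-to-text model. A minimal sketch of that input construction, assuming a simple space-joined concatenation of tags onto the source sentence (the exact separator, tag ordering, and object detector are not specified here and are assumptions for illustration):

```python
def build_multimodal_input(source_text: str, object_tags: list[str]) -> str:
    """Concatenate image object tags with the source sentence.

    object_tags: labels produced by running an object detector on the
    paired image (detector choice and tag format are assumptions; the
    paper only states that object tags are used as visual features
    concatenated with the text sequence).
    """
    # Join the detected tags into one span and append it to the text,
    # yielding a single sequence an mBART-style encoder can consume.
    tag_span = " ".join(object_tags)
    return f"{source_text} {tag_span}".strip()


# Hypothetical example: an English caption plus tags detected in its image.
example = build_multimodal_input(
    "A man riding a horse on the beach.",
    ["man", "horse", "beach"],
)
print(example)
```

The resulting string would then be tokenized and passed to the fine-tuned mBART-50 model exactly like a text-only source sentence, which is what lets the same architecture serve both tracks.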

Similar Work