
Data Processing Matters: SRPH-Konvergen AI's Machine Translation System for WMT'21

Lintang Sutawika, Jan Christian Blaise Cruz. arXiv 2021

Applications · Model Architecture · Pretraining Methods · RAG · Training Techniques · Transformer

In this paper, we describe the submission of the joint Samsung Research Philippines-Konvergen AI team for the WMT'21 Large Scale Multilingual Translation Task - Small Track 2. We submit a standard Seq2Seq Transformer model to the shared task without any training or architecture tricks, relying mainly on the strength of our data preprocessing techniques to boost performance. Our final submission model scored 22.92 average BLEU on the FLORES-101 devtest set, and scored 22.97 average BLEU on the contest's hidden test set, ranking us sixth overall. Despite using only a standard Transformer, our model ranked first in Indonesian to Javanese, showing that data preprocessing matters as much as, if not more than, cutting-edge model architectures and training techniques.
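The abstract credits the system's gains to data preprocessing rather than to model or training tricks, but does not enumerate the steps here. As a minimal illustration of the kind of parallel-corpus cleaning such pipelines commonly apply, the sketch below combines length bounds, a length-ratio check, and exact-pair deduplication; the function name `clean_parallel_corpus`, its thresholds, and the sample sentences are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: common parallel-corpus filtering heuristics,
# not the authors' actual pipeline (which the abstract does not detail).

def clean_parallel_corpus(pairs, min_len=1, max_len=250, max_ratio=2.5):
    """Yield (source, target) sentence pairs that pass simple quality filters."""
    seen = set()
    for src, tgt in pairs:
        src, tgt = src.strip(), tgt.strip()
        n_src, n_tgt = len(src.split()), len(tgt.split())
        # Drop empty or overly long sentences on either side.
        if not (min_len <= n_src <= max_len and min_len <= n_tgt <= max_len):
            continue
        # Drop pairs whose token counts diverge too much (likely misalignments).
        if max(n_src, n_tgt) / max(min(n_src, n_tgt), 1) > max_ratio:
            continue
        # Drop exact duplicate pairs.
        key = (src, tgt)
        if key in seen:
            continue
        seen.add(key)
        yield src, tgt

if __name__ == "__main__":
    sample = [
        ("Selamat pagi.", "Sugeng enjing."),
        ("Selamat pagi.", "Sugeng enjing."),  # duplicate -> dropped
        ("Halo", "a " * 200),                 # extreme length ratio -> dropped
    ]
    for src, tgt in clean_parallel_corpus(sample):
        print(src, "|||", tgt)
```

Heuristics of this kind are cheap to run over millions of sentence pairs and tend to remove the misaligned or duplicated segments that most degrade Transformer training.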

Similar Work