
Summer: WeChat Neural Machine Translation Systems For The WMT22 Biomedical Translation Task

Li Ernan, Meng Fandong, Zhou Jie. arXiv 2022

[Paper]
Tags: Applications, Fine-Tuning, Model Architecture, Pretraining Methods, Training Techniques, Transformer

This paper introduces WeChat’s participation in the WMT 2022 biomedical translation shared task on Chinese\(\to\)English. Our systems are based on the Transformer and use several Transformer variants to improve translation quality. In our experiments, we employ data filtering, data generation, several Transformer variants, fine-tuning, and model ensembling. Our Chinese\(\to\)English system, named Summer, achieves the highest BLEU score among all submissions.
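To make the ensembling step in the abstract concrete, here is a minimal sketch (not the authors' implementation, whose models and configurations are not given here) of probability-level ensembling of several Transformer models at a single decoding step. The tiny PyTorch models, and the names `TinyTransformerLM` and `ensemble_next_token_logprobs`, are hypothetical stand-ins for the trained ensemble members.

```python
# Minimal, hypothetical sketch of model ensembling for NMT decoding.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyTransformerLM(nn.Module):
    """Toy stand-in for a trained Transformer translation model (hypothetical)."""

    def __init__(self, vocab_size: int = 100, d_model: int = 32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) -> logits: (batch, seq_len, vocab)
        return self.out(self.encoder(self.embed(tokens)))


def ensemble_next_token_logprobs(models, tokens):
    """Average the next-token probability distributions of several models."""
    probs = []
    with torch.no_grad():
        for model in models:
            logits = model(tokens)[:, -1, :]          # logits at the last position
            probs.append(F.softmax(logits, dim=-1))   # per-model distribution
    # Average in probability space, then return log-probabilities for scoring.
    return torch.log(torch.stack(probs).mean(dim=0))


if __name__ == "__main__":
    torch.manual_seed(0)
    models = [TinyTransformerLM() for _ in range(3)]  # stand-ins for ensemble members
    tokens = torch.randint(0, 100, (1, 5))            # dummy partial hypothesis
    logprobs = ensemble_next_token_logprobs(models, tokens)
    print("next token:", logprobs.argmax(dim=-1).item())
```

In practice this averaging would be applied at every beam-search step; averaging in probability space rather than logit space is one common choice for combining differently structured Transformer variants.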

Similar Work