Wechat Neural Machine Translation Systems For WMT20

Meng Fandong, Yan Jianhao, Liu Yijin, Gao Yuan, Zeng Xianfeng, Zeng Qinsong, Li Peng, Chen Ming, Zhou Jie, Liu Sifan, Zhou Hao. arXiv 2020

[Paper]

Applications · Distillation · Efficiency And Optimization · Model Architecture · Pretraining Methods · Transformer

We participate in the WMT 2020 shared news translation task on Chinese to English. Our system is based on the Transformer (Vaswani et al., 2017) with effective variants and the DTMT (Meng and Zhang, 2019) architecture. In our experiments, we employ data selection, several synthetic data generation approaches (i.e., back-translation, knowledge distillation, and iterative in-domain knowledge transfer), advanced finetuning approaches, and self-BLEU based model ensemble. Our constrained Chinese to English system achieves a case-sensitive BLEU score of 36.9, the highest among all submissions.
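The abstract names self-BLEU based model ensemble without detailing it. As a rough illustration only, the sketch below shows one common way self-BLEU can be used for ensemble candidate selection: score each pair of models by the BLEU of one model's outputs against the other's (using the sacrebleu library), where a lower pairwise score suggests a more diverse, and potentially more complementary, pair. The function names and the diversity-ranking heuristic here are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): rank candidate model pairs
# by pairwise self-BLEU, computed with sacrebleu. Lower self-BLEU between
# two models' outputs is taken as a sign of greater diversity.
from itertools import combinations

import sacrebleu


def pairwise_self_bleu(outputs_a, outputs_b):
    """BLEU of system A's translations, treating system B's as pseudo-references."""
    return sacrebleu.corpus_bleu(outputs_a, [outputs_b]).score


def rank_model_pairs(model_outputs):
    """model_outputs: dict mapping model name -> list of translated sentences
    (one sentence per source line, same order for every model).

    Returns (pair, self-BLEU) tuples sorted ascending, most diverse pair first.
    """
    scored = []
    for (name_a, out_a), (name_b, out_b) in combinations(model_outputs.items(), 2):
        scored.append(((name_a, name_b), pairwise_self_bleu(out_a, out_b)))
    return sorted(scored, key=lambda item: item[1])


if __name__ == "__main__":
    # Toy example with three hypothetical models on two source sentences.
    outputs = {
        "model_a": ["the cat sat on the mat", "it rained all day"],
        "model_b": ["the cat sat on the mat", "it was raining all day"],
        "model_c": ["a cat is sitting on a mat", "rain fell the whole day"],
    }
    for pair, score in rank_model_pairs(outputs):
        print(pair, round(score, 2))
```

Under this heuristic, pairs with low self-BLEU would be favored when assembling the ensemble, on the assumption that diverse members contribute complementary errors.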

Similar Work