The University of Cambridge's Machine Translation Systems for WMT18

Felix Stahlberg, Adrià de Gispert, Bill Byrne. arXiv 2018

Tags: Applications, Attention Mechanism, Merging, Model Architecture, Pretraining Methods, Transformer

The University of Cambridge submission to the WMT18 news translation task focuses on combining diverse translation models. We compare recurrent, convolutional, and self-attention-based neural models on German-English, English-German, and Chinese-English. Our final system combines all neural models with a phrase-based SMT system in an MBR-based scheme. We report small but consistent gains on top of strong Transformer ensembles.
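The core idea of the MBR-based combination is to pool hypotheses from all systems and select the one with the highest expected utility under the combined model distribution, rather than the single highest-scoring hypothesis. The sketch below is a minimal illustration of that selection step, not the paper's implementation: the unigram-overlap utility is a toy stand-in for the BLEU-style gain such schemes typically use, and the candidate strings and scores are invented for the example.

```python
from collections import Counter

def utility(hyp: str, ref: str) -> float:
    """Toy utility: unigram overlap, a stand-in for a BLEU-style gain."""
    h, r = Counter(hyp.split()), Counter(ref.split())
    overlap = sum((h & r).values())
    return overlap / max(len(hyp.split()), 1)

def mbr_select(candidates, scores):
    """Return the candidate maximizing expected utility, where the
    expectation is taken over the normalized scores of the pooled
    hypotheses (the 'evidence space')."""
    z = sum(scores)
    probs = [s / z for s in scores]
    best, best_gain = None, float("-inf")
    for hyp in candidates:
        gain = sum(p * utility(hyp, ref) for ref, p in zip(candidates, probs))
        if gain > best_gain:
            best, best_gain = hyp, gain
    return best

# Hypothetical pooled n-best list from several systems, with combined scores.
cands = ["the cat sat on the mat",
         "a cat sat on the mat",
         "the cat is sitting on a mat"]
scores = [0.5, 0.3, 0.2]
print(mbr_select(cands, scores))
```

Because the expected utility rewards agreement across the pooled list, a hypothesis supported by several systems can win even if no single system ranks it first, which is how MBR combination can yield gains over simply ensembling.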

Similar Work