MS-UEdin Submission to the WMT2018 APE Shared Task: Dual-Source Transformer for Automatic Post-Editing

Junczys-Dowmunt Marcin, Grundkiewicz Roman. ArXiv 2018

[Paper]    
Model Architecture · Pretraining Methods · Training Techniques · Transformer

This paper describes the Microsoft and University of Edinburgh submission to the Automatic Post-editing shared task at WMT2018. Based on training data and systems from the WMT2017 shared task, we re-implement our own models from the last shared task and introduce improvements based on extensive parameter sharing. Next, we experiment with our implementation of dual-source transformer models and with data selection for the IT domain. Our submission decisively wins the SMT post-editing sub-task, establishing a new state of the art, and is a very close second (or effectively tied, 16.46 vs. 16.50 TER) in the NMT sub-task. Based on the rather weak results in the NMT sub-task, we hypothesize that neural-on-neural APE might not actually be useful.
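The abstract does not spell out the dual-source architecture, so below is a minimal PyTorch sketch of one common formulation: the decoder attends serially to two encoder memories, one over the source sentence and one over the raw MT output, and the two encoder passes reuse the same weights as a loose stand-in for the "extensive parameter sharing" mentioned above. All class names, hyperparameters, and the tying scheme are illustrative assumptions, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn


class DualSourceDecoderLayer(nn.Module):
    """Decoder layer with two serially stacked cross-attention blocks:
    one over the encoded source sentence, one over the encoded MT output."""

    def __init__(self, d_model=512, nhead=8, dim_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout,
                                               batch_first=True)
        self.src_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout,
                                              batch_first=True)
        self.mt_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout,
                                             batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.ReLU(),
                                nn.Dropout(dropout), nn.Linear(dim_ff, d_model))
        self.norms = nn.ModuleList(nn.LayerNorm(d_model) for _ in range(4))
        self.drop = nn.Dropout(dropout)

    def forward(self, tgt, src_mem, mt_mem, tgt_mask=None):
        # Masked self-attention over the partial post-edit.
        a, _ = self.self_attn(tgt, tgt, tgt, attn_mask=tgt_mask)
        x = self.norms[0](tgt + self.drop(a))
        # Cross-attention over the encoded source sentence.
        a, _ = self.src_attn(x, src_mem, src_mem)
        x = self.norms[1](x + self.drop(a))
        # Cross-attention over the encoded raw MT output.
        a, _ = self.mt_attn(x, mt_mem, mt_mem)
        x = self.norms[2](x + self.drop(a))
        # Position-wise feed-forward block.
        return self.norms[3](x + self.drop(self.ff(x)))


class DualSourceAPE(nn.Module):
    """Dual-source transformer sketch: one encoder applied with shared
    weights to both inputs (an assumed tying scheme); positional
    encodings are omitted for brevity."""

    def __init__(self, vocab_size, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)  # shared embedding
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.decoder = nn.ModuleList(DualSourceDecoderLayer(d_model, nhead)
                                     for _ in range(num_layers))
        self.out = nn.Linear(d_model, vocab_size)
        self.scale = math.sqrt(d_model)

    def forward(self, src_ids, mt_ids, pe_ids):
        src_mem = self.encoder(self.embed(src_ids) * self.scale)
        mt_mem = self.encoder(self.embed(mt_ids) * self.scale)  # same weights
        x = self.embed(pe_ids) * self.scale
        # Causal mask: each position sees only earlier post-edit tokens.
        n = pe_ids.size(1)
        mask = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        for layer in self.decoder:
            x = layer(x, src_mem, mt_mem, tgt_mask=mask)
        return self.out(x)  # per-token vocabulary logits


# Smoke test on random token ids (hypothetical vocabulary size).
model = DualSourceAPE(vocab_size=32000)
src = torch.randint(0, 32000, (2, 11))  # source sentence
mt = torch.randint(0, 32000, (2, 13))   # raw MT output
pe = torch.randint(0, 32000, (2, 9))    # shifted post-edit tokens
print(model(src, mt, pe).shape)         # torch.Size([2, 9, 32000])
```

Stacking the two cross-attentions serially is only one way to combine the sources; concatenating the encoder states or gating the two attention outputs are common alternatives, and fully separate per-source encoders would be the unshared counterpart.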
