Augmenting Large Language Model Translators Via Translation Memories

Mu Yongyu, Reheman Abudurexiti, Cao Zhiquan, Fan Yuchun, Li Bei, Li Yinqiao, Xiao Tong, Zhang Chunliang, Zhu Jingbo. arXiv 2023

Tags: Applications, In Context Learning, Prompting

Using translation memories (TMs) as prompts is a promising approach to in-context learning for machine translation. In this work, we take a step towards prompting large language models (LLMs) with TMs and making them better translators. We find that the ability of LLMs to "understand" prompts is indeed helpful for making better use of TMs. Experiments show that the results of a pre-trained LLM translator can be greatly improved by using high-quality TM-based prompts. These results are even comparable to those of state-of-the-art NMT systems that have access to large-scale in-domain bilingual data and are well tuned on the downstream tasks.
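
As a concrete illustration of the approach, the sketch below retrieves the TM entries most similar to the input sentence and formats them as few-shot examples in a translation prompt. The fuzzy-match scorer (`difflib.SequenceMatcher`), the prompt template, and the toy German-English TM are illustrative assumptions, not the paper's exact design.

```python
# A minimal sketch of TM-based prompting for an LLM translator.
# Retrieval scoring and prompt wording are illustrative assumptions.
from difflib import SequenceMatcher


def retrieve_tms(source, tm_pairs, k=3):
    """Return the k TM pairs whose source side best fuzzy-matches `source`."""
    return sorted(
        tm_pairs,
        key=lambda pair: SequenceMatcher(None, source, pair[0]).ratio(),
        reverse=True,
    )[:k]


def build_prompt(source, tms, src_lang="German", tgt_lang="English"):
    """Format retrieved TM pairs as in-context examples, then append
    the sentence to translate. The template wording is hypothetical."""
    blocks = [f"Translate {src_lang} into {tgt_lang}."]
    for tm_src, tm_tgt in tms:
        blocks.append(f"{src_lang}: {tm_src}\n{tgt_lang}: {tm_tgt}")
    blocks.append(f"{src_lang}: {source}\n{tgt_lang}:")
    return "\n\n".join(blocks)


# Usage with a toy TM: the closest entries dominate the prompt.
tm_pairs = [
    ("Der Vertrag tritt am 1. Januar in Kraft.",
     "The contract enters into force on January 1."),
    ("Der Vertrag kann jederzeit gekündigt werden.",
     "The contract may be terminated at any time."),
    ("Das Wetter ist heute schön.",
     "The weather is nice today."),
]
source = "Der Vertrag tritt am 1. März in Kraft."
print(build_prompt(source, retrieve_tms(source, tm_pairs, k=2)))
# The resulting prompt can be passed to any LLM completion API.
```

Because the LLM interprets the examples rather than merely copying them, closer TM matches tend to yield better prompts, which is consistent with the paper's finding that high-quality TM-based prompts drive the gains.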

Similar Work