Neural Machine Translation With Contrastive Translation Memories

Cheng Xin, Gao Shen, Liu Lemao, Zhao Dongyan, Yan Rui. Arxiv 2022

[Paper]    
Tags: Applications, Attention Mechanism, Model Architecture, RAG, Tools, Training Techniques

Retrieval-augmented Neural Machine Translation models have been successful in many translation scenarios. Different from previous works that make use of mutually similar but redundant translation memories (TMs), we propose a new retrieval-augmented NMT model that uses contrastively retrieved translation memories: memories that are holistically similar to the source sentence while individually contrastive to each other, providing maximal information gain across three phases. First, in the TM retrieval phase, we adopt a contrastive retrieval algorithm to avoid the redundancy and uninformativeness of similar translation pieces. Second, in the memory encoding stage, given a set of TMs, we propose a novel Hierarchical Group Attention module to gather both the local context of each TM and the global context of the whole TM set. Finally, in the training phase, a Multi-TM contrastive learning objective is introduced to learn the salient features of each TM with respect to the target sentence. Experimental results show that our framework obtains improvements over strong baselines on the benchmark datasets.
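
The abstract describes the retrieval phase only at a high level. As a rough illustration of what "holistically similar to the source while individually contrastive to each other" could mean in practice, the sketch below uses a greedy, MMR-style selection that rewards similarity to the source and penalizes redundancy with already-selected memories. The function name, the `sim` scoring function, and the `alpha` trade-off are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch of diversity-aware ("contrastive") TM retrieval.
# Assumption: an MMR-style greedy objective stands in for the paper's
# actual contrastive retrieval algorithm, whose details are not in the abstract.
from typing import Callable, List


def contrastive_tm_retrieval(
    source: str,
    candidates: List[str],
    sim: Callable[[str, str], float],  # e.g. fuzzy-match or BM25 score (assumed)
    k: int = 3,
    alpha: float = 0.7,
) -> List[str]:
    """Select k TMs similar to `source` but mutually dissimilar to each other."""
    selected: List[str] = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        def score(tm: str) -> float:
            relevance = sim(source, tm)
            redundancy = max((sim(tm, s) for s in selected), default=0.0)
            # Trade off relevance to the source against redundancy with
            # the memories already chosen.
            return alpha * relevance - (1.0 - alpha) * redundancy

        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```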
