
POMP: Probability-driven Meta-graph Prompter for LLMs in Low-resource Unsupervised Neural Machine Translation

Shilong Pan, Zhiliang Tian, Liang Ding, Zhen Huang, Zhihua Wen, Dongsheng Li. arXiv 2024

[Paper]    
Applications, Ethics and Bias, Fine-tuning, In-context Learning, Pretraining Methods, Prompting, Reinforcement Learning, Training Techniques

Low-resource languages (LRLs) face challenges in supervised neural machine translation due to limited parallel data, prompting research into unsupervised methods. Unsupervised neural machine translation (UNMT) methods, including back-translation, transfer learning, and pivot-based translation, offer practical solutions for LRL translation, but they are hindered by issues like synthetic-data noise, language bias, and error propagation, which can potentially be mitigated by Large Language Models (LLMs). LLMs have advanced NMT with in-context learning (ICL) and supervised fine-tuning, but insufficient training data results in poor performance on LRLs. We argue that LLMs can mitigate linguistic noise with auxiliary languages to improve translations in LRLs. In this paper, we propose the Probability-driven Meta-graph Prompter (POMP), a novel approach employing a dynamic, sampling-based graph of multiple auxiliary languages to enhance LLMs’ translation capabilities for LRLs. POMP constructs a directed acyclic meta-graph for each source language, from which we dynamically sample multiple paths to prompt LLMs, mitigating linguistic noise and improving translations during training. We use the BLEURT metric to score the translations and back-propagate the scores as rewards to update the probabilities of auxiliary languages in the paths. Our experiments show significant improvements in the translation quality of three LRLs, demonstrating the effectiveness of our approach.
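The core loop described above — sampling auxiliary-language paths from a probability-weighted meta-graph and updating those probabilities from translation-quality rewards — can be sketched as follows. This is a minimal illustrative sketch under assumptions of our own: the class and method names, the path length, and the exponential-moving-average reward update are hypothetical stand-ins, not the authors' actual implementation or objective.

```python
import random


class MetaGraph:
    """Illustrative sketch of probability-driven auxiliary-language sampling.

    Each auxiliary language carries a sampling probability; paths are drawn
    according to these probabilities, and a BLEURT-style reward nudges the
    probabilities of the languages on a rewarded path. The update rule
    (an exponential moving average followed by renormalization) is an
    assumption for illustration only.
    """

    def __init__(self, aux_langs, lr=0.1, seed=0):
        self.aux_langs = list(aux_langs)
        # Start from a uniform distribution over auxiliary languages.
        self.probs = {lang: 1.0 / len(aux_langs) for lang in aux_langs}
        self.lr = lr
        self.rng = random.Random(seed)

    def sample_path(self, length=2):
        """Sample a pivot path (languages without replacement) by probability."""
        pool = dict(self.probs)
        path = []
        for _ in range(min(length, len(pool))):
            names, weights = zip(*pool.items())
            choice = self.rng.choices(names, weights=weights, k=1)[0]
            path.append(choice)
            del pool[choice]  # no repeated language within one path
        return path

    def update(self, path, reward):
        """Move probabilities of languages on the path toward the reward."""
        for lang in path:
            self.probs[lang] += self.lr * (reward - self.probs[lang])
        # Renormalize so the probabilities remain a distribution.
        total = sum(self.probs.values())
        for lang in self.probs:
            self.probs[lang] /= total
```

In use, each training step would sample a path, build a prompt through those auxiliary languages, score the resulting translation (e.g. with BLEURT), and call `update` with that score, so that helpful auxiliary languages gradually become more likely to be sampled.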

Similar Work