
From Classification To Generation: Insights Into Crosslingual Retrieval Augmented ICL

Li Xiaoqian, Nie Ercong, Liang Sheng. arXiv 2023

[Paper]    
Tags: In-Context Learning, Prompting, RAG

The remarkable ability of Large Language Models (LLMs) to understand and follow instructions has sometimes been limited by their in-context learning (ICL) performance in low-resource languages. To address this, we introduce a novel approach that leverages cross-lingual retrieval-augmented in-context learning (CREA-ICL). By extracting semantically similar prompts from high-resource languages, we aim to improve the zero-shot performance of multilingual pre-trained language models (MPLMs) across diverse tasks. Though our approach yields steady improvements in classification tasks, it faces challenges in generation tasks. Our evaluation offers insights into the performance dynamics of retrieval-augmented in-context learning across both classification and generation domains.
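The core mechanism described above is cross-lingual example retrieval: embed the low-resource input with a multilingual encoder, fetch the most semantically similar labeled examples from a high-resource pool, and prepend them as in-context demonstrations. The sketch below illustrates this idea for a classification task; the encoder checkpoint, the toy English pool, and the prompt template are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch of cross-lingual retrieval-augmented ICL (CREA-ICL-style).
# Assumptions (not from the paper): the encoder checkpoint, the toy
# high-resource pool, and the prompt template are all illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

# Multilingual sentence encoder so queries and pool share one embedding space.
encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Hypothetical labeled pool in a high-resource language (English).
pool = [
    ("The movie was a delight from start to finish.", "positive"),
    ("A tedious, overlong mess.", "negative"),
    ("Solid acting, but the plot falls apart.", "negative"),
]
pool_texts = [text for text, _ in pool]
pool_emb = encoder.encode(pool_texts, normalize_embeddings=True)

def build_prompt(query: str, k: int = 2) -> str:
    """Retrieve the k most similar high-resource examples and prepend
    them as demonstrations for the low-resource query."""
    q_emb = encoder.encode([query], normalize_embeddings=True)
    # With normalized embeddings, cosine similarity is a dot product.
    sims = (pool_emb @ q_emb.T).squeeze(-1)
    top = np.argsort(-sims)[:k]
    demos = "\n".join(
        f"Review: {pool_texts[i]}\nSentiment: {pool[i][1]}" for i in top
    )
    return f"{demos}\nReview: {query}\nSentiment:"

# Low-resource (e.g., Swahili) input; the MPLM then completes the prompt
# conditioned on the retrieved English demonstrations.
print(build_prompt("Filamu hii ilikuwa nzuri sana."))
```

For classification, the retrieved demonstrations constrain the label space, which is consistent with the steady gains reported above; for generation, similar inputs do not pin down a similar output string, which is one plausible reading of why the same retrieval step helps less there.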
