Cascading Adaptors To Leverage English Data To Improve Performance Of Question Answering For Low-resource Languages

Pandya Hariom A., Ardeshna Bhavik, Bhatt Brijesh S. arXiv 2021

[Paper]    
Applications Model Architecture Pretraining Methods RAG Transformer

Transformer-based architectures have shown notable results on many downstream tasks, including question answering. However, the limited availability of training data impedes achieving comparable performance for low-resource languages. In this paper, we investigate the applicability of pre-trained multilingual models to improve the performance of question answering in low-resource languages. We tested four combinations of language and task adapters using multilingual transformer architectures on seven languages from the MLQA dataset. Additionally, we propose zero-shot transfer learning for low-resource question answering using language and task adapters. We observed that stacking the language and the task adapters significantly improves the performance of multilingual transformer models for low-resource languages.
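The stacking scheme described in the abstract is in the spirit of the MAD-X recipe: a frozen language adapter handles language-specific representations, a task adapter trained only on English QA data is cascaded on top, and at inference the English language adapter is swapped for the target language's. Below is a minimal sketch of such a setup using the AdapterHub `adapters` library; the Hub adapter identifiers and the `"qa"` adapter name are illustrative assumptions, not taken from the paper's released code.

```python
# Sketch: cascade a language adapter and a task adapter for zero-shot
# cross-lingual extractive QA (MAD-X-style stacking). Hub identifiers
# such as "en/wiki@ukp" and "hi/wiki@ukp" are assumed examples.
import adapters.composition as ac
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# Pretrained language adapters from AdapterHub; both stay frozen.
en = model.load_adapter("en/wiki@ukp", config="pfeiffer")
hi = model.load_adapter("hi/wiki@ukp", config="pfeiffer")

# Task adapter and QA head, to be trained on English SQuAD-style data.
model.add_adapter("qa", config="pfeiffer")
model.add_qa_head("qa")
model.train_adapter("qa")  # only the "qa" adapter receives gradient updates

# Training: input flows through the English language adapter, then "qa".
model.active_adapters = ac.Stack(en, "qa")
# ... fine-tune on English QA data here ...

# Zero-shot inference: swap in the target-language adapter, reuse "qa".
model.active_adapters = ac.Stack(hi, "qa")
```

Because only the small task adapter is trained while the base model and language adapters remain frozen, the same task adapter can be paired with any available language adapter at inference time, which is what enables the zero-shot transfer to low-resource languages.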

Similar Work