Lightmbert: A Simple Yet Effective Method For Multilingual BERT Distillation

Jiao Xiaoqi, Yin Yichun, Shang Lifeng, Jiang Xin, Chen Xiao, Li Linlin, Wang Fang, Liu Qun. arXiv 2021

[Paper]    
Applications BERT Distillation Efficiency And Optimization Model Architecture

Multilingual pre-trained language models (e.g., mBERT, XLM and XLM-R) have shown impressive performance on cross-lingual natural language understanding tasks. However, these models are computationally intensive and difficult to deploy on resource-restricted devices. In this paper, we propose a simple yet effective distillation method (LightMBERT) for transferring the cross-lingual generalization ability of multilingual BERT to a small student model. The experimental results empirically demonstrate the efficiency and effectiveness of LightMBERT, which is significantly better than the baselines and performs comparably to the teacher mBERT.
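To make the general idea of teacher-to-student distillation concrete, below is a minimal PyTorch sketch of knowledge distillation, combining a soft-label (KL) term on logits with a hidden-state (MSE) term. The `TinyEncoder` module, loss weights, temperature, and layer counts are illustrative assumptions for this sketch only; they are not the actual LightMBERT architecture or training recipe.

```python
# Generic knowledge-distillation sketch (NOT the actual LightMBERT setup):
# a large "teacher" encoder supervises a smaller "student" encoder through
# soft labels (KL on logits) and hidden-state matching (MSE).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    """Toy stand-in for a (multilingual) transformer encoder with a classifier head."""
    def __init__(self, vocab_size=1000, hidden=128, layers=2, num_labels=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        enc_layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids):
        hidden_states = self.encoder(self.embed(input_ids))
        logits = self.classifier(hidden_states[:, 0])  # first token as sentence representation
        return hidden_states, logits

def distillation_loss(student_out, teacher_out, temperature=2.0, alpha=0.5):
    """Weighted sum of soft-label (KL) and hidden-state (MSE) distillation terms."""
    s_hidden, s_logits = student_out
    t_hidden, t_logits = teacher_out
    kl = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    mse = F.mse_loss(s_hidden, t_hidden)
    return alpha * kl + (1 - alpha) * mse

if __name__ == "__main__":
    teacher = TinyEncoder(layers=6)   # larger teacher (frozen during distillation)
    student = TinyEncoder(layers=2)   # smaller student to be trained
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

    input_ids = torch.randint(0, 1000, (8, 16))  # random token ids as dummy data
    with torch.no_grad():
        teacher_out = teacher(input_ids)
    loss = distillation_loss(student(input_ids), teacher_out)
    loss.backward()
    optimizer.step()
    print(f"distillation loss: {loss.item():.4f}")
```

In practice the student's hidden size often differs from the teacher's, in which case a learned projection is needed before the MSE term; the sketch sidesteps this by giving both encoders the same hidden dimension.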

Similar Work