
Checks And Strategies For Enabling Code-switched Machine Translation

Thamme Gowda, Mozhdeh Gheini, Jonathan May. arXiv 2022

[Paper]    
Tags: Applications, Attention Mechanism, Model Architecture, Reinforcement Learning, Security

Code-switching is a common phenomenon among multilingual speakers, where alternation between two or more languages occurs within the context of a single conversation. While multilingual humans can seamlessly switch back and forth between languages, multilingual neural machine translation (NMT) models are not robust to such sudden changes in input. This work explores multilingual NMT models’ ability to handle code-switched text. First, we propose checks to measure switching capability. Second, we investigate simple and effective data augmentation methods that can enhance an NMT model’s ability to support code-switching. Finally, by using a glass-box analysis of attention modules, we demonstrate the effectiveness of these methods in improving robustness.
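To make the abstract's "simple and effective data augmentation" concrete, the sketch below illustrates one plausible variant: concatenating parallel sentences from two language pairs so the model sees code-switch-like inputs during training. This is a minimal illustration under that assumption, not the paper's exact recipe; `concat_augment` and its parameters are hypothetical names chosen for this example.

```python
# Hypothetical sketch: sentence-level concatenation across language pairs
# as a data augmentation for code-switched NMT. Names are illustrative.
import random

def concat_augment(bitext_a, bitext_b, rate=0.5, seed=0):
    """Create synthetic mixed-language training examples.

    bitext_a, bitext_b: lists of (source, target) sentence tuples,
        e.g. German-English and French-English parallel data.
    rate: number of synthetic pairs, as a fraction of the smaller corpus.
    Returns the original data plus the concatenated pairs.
    """
    rng = random.Random(seed)
    augmented = list(bitext_a) + list(bitext_b)
    n_new = int(rate * min(len(bitext_a), len(bitext_b)))
    for _ in range(n_new):
        src_a, tgt_a = rng.choice(bitext_a)
        src_b, tgt_b = rng.choice(bitext_b)
        # The source side now mixes two languages, mimicking a code-switched
        # input; the target is the matching concatenation of translations.
        augmented.append((f"{src_a} {src_b}", f"{tgt_a} {tgt_b}"))
    return augmented

if __name__ == "__main__":
    deu_eng = [("Guten Morgen.", "Good morning."), ("Danke.", "Thank you.")]
    fra_eng = [("Bonjour.", "Hello."), ("Merci.", "Thank you.")]
    for src, tgt in concat_augment(deu_eng, fra_eng, rate=1.0):
        print(f"{src!r} -> {tgt!r}")
```

A check in the abstract's sense could then feed such concatenated inputs to a trained model and verify that both language segments are translated, rather than one being dropped.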

Similar Work