Temporal Attention Model For Neural Machine Translation

Baskaran Sankaran, Haitao Mi, Yaser Al-Onaizan, Abe Ittycheriah. arXiv 2016

[Paper]    
Tags: Applications, Attention Mechanism, Merging, Model Architecture

Attention-based Neural Machine Translation (NMT) models suffer from attention deficiency issues, as has been observed in recent research. We propose a novel mechanism to address some of these limitations and improve NMT attention. Specifically, our approach memorizes the alignments temporally (within each sentence) and modulates the attention with the accumulated temporal memory as the decoder generates the candidate translation. We compare our approach against the baseline NMT model and two other related approaches that address this issue either explicitly or implicitly. Large-scale experiments on two language pairs show that our approach achieves consistent and robust gains over the baseline and the related NMT approaches. Our model further outperforms strong SMT baselines in some settings, even without using ensembles.
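The entry gives no equations, so the following is a minimal NumPy sketch of one plausible reading of the mechanism: keep a running sum of the exponentiated alignment scores for each source position and divide the current step's scores by that accumulated memory before renormalizing, so positions the decoder already attended to are progressively down-weighted. The function name, the `1 + history` denominator, and the toy scores are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def temporal_attention_step(scores, history):
    """One decoder step of a temporal-attention sketch.

    scores  -- raw alignment scores e_t over source positions, shape (src_len,)
    history -- accumulated exp(scores) from earlier decoder steps,
               shape (src_len,); all zeros at the first step
    Returns the modulated attention weights and the updated history.
    """
    exp_scores = np.exp(scores)
    # Down-weight source positions that earlier steps already attended to:
    # the larger the accumulated memory for a position, the smaller its weight.
    modulated = exp_scores / (1.0 + history)
    weights = modulated / modulated.sum()  # renormalize to a distribution
    return weights, history + exp_scores

# Toy decoding loop with random stand-ins for the alignment model's scores.
rng = np.random.default_rng(0)
src_len = 6
history = np.zeros(src_len)
for t in range(3):
    e_t = rng.normal(size=src_len)  # hypothetical alignment scores at step t
    alpha_t, history = temporal_attention_step(e_t, history)
    print(f"step {t}: {np.round(alpha_t, 3)}")
```

Running the loop shows the intended effect: a source position that receives high attention at one step sees its weight suppressed at later steps, which counters the over- and under-attention ("attention deficiency") behavior the abstract describes.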

Similar Work