Chunk-based Bi-scale Decoder For Neural Machine Translation

Zhou Hao, Tu Zhaopeng, Huang Shujian, Liu Xiaohua, Li Hang, Chen Jiajun. arXiv 2017

[Paper]
Tags: Applications, RAG

In typical neural machine translation (NMT), the decoder generates a sentence word by word, packing all linguistic granularities into the same time-scale of the RNN. In this paper, we propose a new type of decoder for NMT, which splits the decoding state into two parts and updates them on two different time-scales. Specifically, we first predict a chunk time-scale state for phrasal modeling, on top of which multiple word time-scale states are generated. In this way, the target sentence is translated hierarchically from chunks to words, leveraging information at different granularities. Experiments show that our proposed model significantly improves translation performance over a state-of-the-art NMT model.
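The two-time-scale update described in the abstract can be sketched as a pair of nested loops: a slow chunk-scale state updated once per chunk, and a fast word-scale state updated at every word and conditioned on the current chunk state. The sketch below is a toy illustration, not the paper's implementation: the `cell` function, parameter names (`Wc`, `Uc`, `Ww`, `Uw`), hidden size, and the use of a plain tanh RNN cell in place of the paper's gated units are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8  # toy hidden size; real models use hundreds of units

def cell(state, inp, W, U):
    # Toy tanh RNN cell standing in for the paper's gated recurrent units.
    return np.tanh(W @ state + U @ inp)

# Hypothetical parameters for the chunk-scale and word-scale RNNs.
Wc, Uc = rng.normal(size=(H, H)), rng.normal(size=(H, H))
Ww, Uw = rng.normal(size=(H, H)), rng.normal(size=(H, H))

def bi_scale_decode(context, chunks):
    """Decode hierarchically: one chunk-scale update per chunk,
    then word-scale updates for each word inside that chunk."""
    chunk_state = np.zeros(H)
    word_states = []
    for chunk in chunks:
        # Chunk time-scale: updated once per chunk (slow scale).
        chunk_state = cell(chunk_state, context, Wc, Uc)
        # Word scale is (re)initialized from the chunk-scale state.
        word_state = chunk_state
        for word_emb in chunk:
            # Word time-scale: updated at every word (fast scale),
            # conditioned on the current chunk state.
            word_state = cell(word_state, word_emb + chunk_state, Ww, Uw)
            word_states.append(word_state)
    return word_states

# Usage: a source context vector and two target chunks (2 and 3 words).
context = rng.normal(size=H)
chunks = [[rng.normal(size=H) for _ in range(2)],
          [rng.normal(size=H) for _ in range(3)]]
states = bi_scale_decode(context, chunks)
print(len(states))  # one word-scale state per target word
```

In an actual NMT decoder, each word-scale state would feed a softmax over the target vocabulary, and the chunk boundaries would themselves be predicted rather than given.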

Similar Work