A Unified Query-based Generative Model For Question Generation And Question Answering

Linfeng Song, Zhiguo Wang, Wael Hamza. arXiv 2017

[Paper]    
Applications Attention Mechanism Ethics And Bias Model Architecture RAG Tools Training Techniques

We propose a query-based generative model for solving both tasks of question generation (QG) and question answering (QA). The model follows the classic encoder-decoder framework. The encoder takes a passage and a query as input, then performs query understanding by matching the query against the passage from multiple perspectives. The decoder is an attention-based Long Short-Term Memory (LSTM) model with copy and coverage mechanisms. In the QG task, the system generates a question given the passage and the target answer, whereas in the QA task, it generates the answer given the question and the passage. During training, we leverage a policy-gradient reinforcement learning algorithm to overcome exposure bias, a major problem resulting from sequence learning with a cross-entropy loss. For the QG task, our experiments show higher performance than the state-of-the-art results. When used as additional training data, the automatically generated questions even improve the performance of a strong extractive QA system. In addition, our model outperforms the state-of-the-art baselines on the generative QA task.
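The policy-gradient training the abstract mentions can be illustrated with a REINFORCE-style sequence loss: instead of token-level cross-entropy against the reference, the model samples an output sequence and scales its log-likelihood by a reward (e.g., a sequence-level metric) minus a baseline. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the function name, the baseline choice, and the toy numbers are assumptions.

```python
import numpy as np

def policy_gradient_loss(sampled_logprobs, reward, baseline):
    """REINFORCE-style loss for one sampled output sequence.

    sampled_logprobs: per-token log-probabilities of the sampled sequence
    reward: sequence-level score of the sample (e.g., a BLEU-like metric)
    baseline: reward baseline used to reduce gradient variance
    """
    advantage = reward - baseline
    # Negated so that gradient descent increases the likelihood of
    # samples whose reward beats the baseline.
    return -advantage * np.sum(sampled_logprobs)

# Toy example: a 3-token sampled question whose reward beats the baseline.
logp = np.array([-0.5, -1.2, -0.3])
loss = policy_gradient_loss(logp, reward=0.8, baseline=0.5)
# advantage = 0.3, sum(logp) = -2.0, so loss = 0.6
```

Because the reward is computed on the model's own samples rather than teacher-forced prefixes, training in this regime exposes the decoder to its own predictions, which is how it mitigates exposure bias.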

Similar Work