Medical Dialogue Response Generation With Pivotal Information Recalling

Zhao Yu, Li Yunxin, Wu Yuxiang, Hu Baotian, Chen Qingcai, Wang Xiaolong, Ding Yuxin, Zhang Min. arXiv 2022

[Paper]    
Tags: Attention Mechanism, Model Architecture, Transformer

Medical dialogue generation is an important yet challenging task. Most previous works rely on the attention mechanism and large-scale pretrained language models. However, these methods often fail to acquire pivotal information from a long dialogue history to yield an accurate and informative response, because medical entities are usually scattered across multiple utterances and linked by complex relationships. To mitigate this problem, we propose a medical response generation model with Pivotal Information Recalling (MedPIR), which is built on two components: a knowledge-aware dialogue graph encoder and a recall-enhanced generator. The knowledge-aware dialogue graph encoder constructs a dialogue graph by exploiting the knowledge relationships between entities in the utterances, and encodes it with a graph attention network. The recall-enhanced generator then strengthens the use of this pivotal information by generating a summary of the dialogue before producing the actual response. Experimental results on two large-scale medical dialogue datasets show that MedPIR outperforms strong baselines in BLEU score and medical entity F1.
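As a rough illustration of the two components named in the abstract, the PyTorch sketch below pairs a single graph-attention layer (standing in for the knowledge-aware dialogue graph encoder) with a toy two-stage head that scores a dialogue summary before the response (standing in for the recall-enhanced generator). All class names, dimensions, and the single-layer design are assumptions for illustration only; the paper's actual model builds on pretrained Transformers and is considerably larger.

```python
# Minimal sketch of MedPIR-style components (hypothetical names and sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """One graph-attention layer over entity/utterance nodes."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim, bias=False)
        self.attn = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, nodes: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # nodes: (N, dim) node features; adj: (N, N) 0/1 adjacency matrix.
        h = self.proj(nodes)                                      # (N, dim)
        n = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1),                     # source features
             h.unsqueeze(0).expand(n, n, -1)], dim=-1)            # target features
        scores = F.leaky_relu(self.attn(pairs)).squeeze(-1)       # (N, N)
        scores = scores.masked_fill(adj == 0, float("-inf"))      # attend only over edges
        alpha = F.softmax(scores, dim=-1)
        return alpha @ h                                          # aggregated node states


class RecallEnhancedGenerator(nn.Module):
    """Two-stage decoding sketch: score a summary first, then condition the
    response on both the graph-encoded history and the summary state."""

    def __init__(self, dim: int, vocab: int):
        super().__init__()
        self.summary_head = nn.Linear(dim, vocab)        # stage 1: recall summary
        self.response_head = nn.Linear(2 * dim, vocab)   # stage 2: final response

    def forward(self, graph_state: torch.Tensor, summary_state: torch.Tensor):
        summary_logits = self.summary_head(graph_state)
        fused = torch.cat([graph_state, summary_state], dim=-1)
        response_logits = self.response_head(fused)
        return summary_logits, response_logits


if __name__ == "__main__":
    enc = GraphAttentionLayer(dim=64)
    nodes = torch.randn(5, 64)   # 5 entity/utterance nodes
    adj = torch.eye(5)           # self-loops so every node attends to something
    graph_state = enc(nodes, adj)
    gen = RecallEnhancedGenerator(dim=64, vocab=100)
    # Here the summary state is a placeholder; in the paper it would encode
    # the generated dialogue summary before the response is produced.
    summary_logits, response_logits = gen(graph_state, graph_state)
    print(summary_logits.shape, response_logits.shape)  # (5, 100) (5, 100)
```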
