Domain-adaptive Pretraining Methods For Dialogue Understanding

Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song. arXiv 2021

Tags: BERT, Model Architecture, Pretraining Methods, Training Techniques

Language models like BERT and SpanBERT pretrained on open-domain data have achieved impressive gains on various NLP tasks. In this paper, we probe the effectiveness of domain-adaptive pretraining objectives on downstream tasks. In particular, three objectives, including a novel objective focusing on modeling predicate-argument relations, are evaluated on two challenging dialogue understanding tasks. Experimental results demonstrate that domain-adaptive pretraining with proper objectives can significantly improve the performance of a strong baseline on these tasks, achieving new state-of-the-art performance.
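
In practice, domain-adaptive pretraining of this kind means continuing the pretraining of an open-domain checkpoint on in-domain dialogue text before fine-tuning on the downstream task. The sketch below illustrates that general recipe with Hugging Face Transformers, assuming a plain-text dialogue corpus (the file name `dialogue_corpus.txt` and all hyperparameters are illustrative); it uses ordinary random token masking, not the SpanBERT-style span masking or the paper's predicate-argument objective, which are not specified in this summary.

```python
# Minimal sketch of domain-adaptive pretraining: continue masked-LM training
# of a BERT checkpoint on in-domain dialogue text, then fine-tune elsewhere.
# The corpus path and hyperparameters are assumptions, not the paper's setup.
from datasets import load_dataset
from transformers import (BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical in-domain corpus: one dialogue turn per line.
dialogues = load_dataset("text", data_files={"train": "dialogue_corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dialogues.map(tokenize, batched=True, remove_columns=["text"])

# Random 15% token masking (BERT's MLM objective); the paper's objectives,
# e.g. the predicate-argument one, would replace this masking strategy.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(output_dir="dapt-bert",
                         num_train_epochs=1,
                         per_device_train_batch_size=16)
Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```

The resulting checkpoint in `dapt-bert` would then serve as the starting point for fine-tuning on the dialogue understanding tasks.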

Similar Work