
Grounding In Social Media: An Approach To Building A Chit-chat Dialogue Model

Ritvik Choudhary, Daisuke Kawahara. arXiv 2022

[Paper]    
Applications

Building open-domain dialogue systems capable of rich human-like conversational ability is one of the fundamental challenges in language generation. However, even with recent advancements in the field, existing open-domain generative models fail to capture and utilize external knowledge, leading to repetitive or generic responses to unseen utterances. Current work on knowledge-grounded dialogue generation primarily focuses on persona incorporation or searching a fact-based structured knowledge source such as Wikipedia. Our method takes a broader and simpler approach, which aims to improve the raw conversation ability of the system by mimicking the human response behavior through casual interactions found on social media. Utilizing a joint retriever-generator setup, the model queries a large set of filtered comment data from Reddit to act as additional context for the seq2seq generator. Automatic and human evaluations on open-domain dialogue datasets demonstrate the effectiveness of our approach.
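The retriever-generator pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the word-overlap scorer, the `<sep>` token, and the tiny `COMMENT_POOL` standing in for the filtered Reddit comment data are all assumptions, and the actual seq2seq generator is omitted.

```python
from collections import Counter

# Hypothetical mini-corpus standing in for the paper's filtered Reddit comments.
COMMENT_POOL = [
    "i love hiking on weekends, the fresh air is great",
    "pizza is my favourite food, especially with extra cheese",
    "my dog gets so excited every time we go to the park",
]

def retrieve(query: str, pool: list, k: int = 2) -> list:
    """Rank comments by word overlap with the query (a toy stand-in for the
    paper's retriever) and return the top-k as grounding candidates."""
    q = Counter(query.lower().split())
    scored = sorted(pool, key=lambda c: -sum((q & Counter(c.lower().split())).values()))
    return scored[:k]

def build_generator_input(dialogue_context: str, retrieved: list) -> str:
    """Concatenate retrieved comments with the dialogue context to form the
    additional-context input for a seq2seq generator (generator not shown)."""
    return " <sep> ".join(retrieved + [dialogue_context])

context = "do you like pizza"
grounding = retrieve(context, COMMENT_POOL, k=1)
model_input = build_generator_input(context, grounding)
```

The key design point the sketch mirrors is that retrieval output is used as extra conditioning text for the generator, rather than as a structured knowledge lookup.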

Similar Work