
GNOME: Generating Negotiations Through Open-domain Mapping Of Exchanges

Darshan Deshpande, Shambhavi Sinha, Anirudh Ravi Kumar, Debaditya Pal, Jonathan May. arXiv 2024

[Paper]

Tags: Pretraining Methods, Tools, Training Techniques

Language models have previously shown strong negotiation capabilities in closed domains, where the scope of negotiation strategy prediction is constrained to a specific setup. In this paper, we first show that these models do not generalize beyond their original training domain despite their large-scale pretraining. Following this, we propose an automated framework called GNOME, which processes existing human-annotated, closed-domain datasets using Large Language Models and produces synthetic open-domain dialogues for negotiation. GNOME improves the generalizability of negotiation systems while reducing the expensive and subjective task of manual data curation. Through our experimental setup, we create a benchmark comparing encoder and decoder models trained on existing datasets against those trained on datasets created through GNOME. Our results show that models trained on our dataset not only outperform previous state-of-the-art models on domain-specific strategy prediction, but also generalize better to previously unseen domains.
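The core mapping step the abstract describes (rewriting closed-domain annotated dialogues into a new domain while carrying over strategy labels) could be sketched roughly as below. This is a hypothetical illustration, not the authors' actual pipeline: the prompt wording, the `map_dialogue` helper, and the stubbed `call_llm` function are all assumptions, and a real implementation would call an actual LLM API.

```python
def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM API call (assumption: in practice this
    would query a model; here it just echoes the prompt's last line)."""
    return prompt.strip().splitlines()[-1]


def build_prompt(utterance: str, strategy: str,
                 source_domain: str, target_domain: str) -> str:
    """Build an illustrative rewriting prompt for one annotated utterance."""
    return (
        f"The following {source_domain} negotiation utterance uses the "
        f"strategy '{strategy}'. Rewrite it for a {target_domain} "
        f"negotiation, keeping the same strategy.\n"
        f"Utterance: {utterance}"
    )


def map_dialogue(dialogue, source_domain, target_domain):
    """Rewrite each (utterance, strategy) pair into the target domain;
    the human-annotated strategy label is preserved unchanged."""
    return [
        (call_llm(build_prompt(u, s, source_domain, target_domain)), s)
        for u, s in dialogue
    ]


# Example: a closed-domain (campsite-resource) dialogue mapped to a new domain.
closed = [("I'll give you two firewood for one water.", "trade-offer")]
open_dialogue = map_dialogue(closed, "campsite resource", "used-car price")
```

The key design point the sketch illustrates is that only the surface form of each utterance is regenerated; the strategy annotations transfer directly, which is what lets the synthetic open-domain data reuse the original human labels.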

Similar Work