
Quick Starting Dialog Systems With Paraphrase Generation

Marceau Louis, Belbahar Raouf, Queudot Marc, Naji Nada, Charton Eric, Meurs Marie-Jean. arXiv 2022

[Paper]    
Agentic, Applications, Model Architecture, Pretraining Methods, Security, Training Techniques, Transformer

Acquiring training data to improve the robustness of dialog systems can be a painstakingly long process. In this work, we propose a method to reduce the cost and effort of creating new conversational agents by artificially generating more data from existing examples, using paraphrase generation. Our proposed approach can kick-start a dialog system with little human effort, and brings its performance to a level satisfactory enough to allow actual interactions with real end-users. We experimented with two neural paraphrasing approaches, namely Neural Machine Translation and a Transformer-based seq2seq model. We present the results obtained with two datasets in English and in French: a crowd-sourced public intent classification dataset and our own corporate dialog system dataset. We show that our proposed approach increased the generalization capabilities of the intent classification model on both datasets, reducing the effort required to initialize a new dialog system and helping to deploy this technology at scale within an organization.
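The augmentation idea described in the abstract can be sketched with an NMT round trip (back-translation): seed utterances are translated into a pivot language and back, and the resulting paraphrases are added to the intent-classification training set under the original intent label. The sketch below is a minimal illustration, not the authors' exact setup; the Helsinki-NLP/opus-mt model names, the `book_flight` intent label, and the example utterance are illustrative assumptions.

```python
# Minimal sketch: paraphrase augmentation via NMT round trip (back-translation).
# Assumes `transformers`, `torch`, and `sentencepiece` are installed.
# Model names and the intent label are illustrative, not the paper's exact setup.
from transformers import MarianMTModel, MarianTokenizer


def load(model_name):
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    return tokenizer, model


def translate(texts, tokenizer, model, num_beams=5):
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    generated = model.generate(**batch, num_beams=num_beams, max_length=64)
    return [tokenizer.decode(g, skip_special_tokens=True) for g in generated]


# English -> French -> English round trip produces paraphrases of the seed
# utterances, which keep the same intent label in the augmented training set.
en_fr = load("Helsinki-NLP/opus-mt-en-fr")
fr_en = load("Helsinki-NLP/opus-mt-fr-en")

seed_utterances = ["I want to book a flight to Montreal tomorrow morning."]
french = translate(seed_utterances, *en_fr)
paraphrases = translate(french, *fr_en)

augmented = [(p, "book_flight") for p in paraphrases]  # hypothetical intent label
print(augmented)
```

In practice, beam or sampling diversity controls how different the paraphrases are from the seeds; the augmented pairs are simply appended to the existing examples before retraining the intent classifier.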

Similar Work