
Comparing Generative Chatbots Based On Process Requirements

Luis Fernando Lins, Nathalia Nascimento, Paulo Alencar, Toacy Oliveira, Donald Cowan. arXiv 2023

[Paper]    
GPT, Model Architecture, Pretraining Methods, Reinforcement Learning, Transformer

Business processes are commonly represented in modelling languages such as Event-driven Process Chain (EPC), Yet Another Workflow Language (YAWL), and the most widely used standard notation, the Business Process Model and Notation (BPMN). Recently, chatbots, programs that let users interact with a machine in natural language, have been increasingly used to support business process execution. A notable recent category is generative chatbots, powered by Large Language Models (LLMs) such as OpenAI’s Generative Pre-Trained Transformer (GPT) and Google’s Pathways Language Model (PaLM), which have billions of parameters and support conversational intelligence. However, it is not clear whether generative chatbots can understand and meet the requirements of constructs such as those provided by BPMN for process execution support. This paper presents a case study comparing the performance of two prominent generative models, GPT and PaLM, in the context of process execution support. The research sheds light on the challenging problem of using conversational approaches, supported by generative chatbots, to interpret process-aware modelling notations and help users execute their tasks.
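As a rough illustration of what "process execution support" through a generative chatbot can look like, the sketch below prompts an LLM with a small BPMN fragment and asks which task comes next. This is a minimal sketch, not material from the paper: the BPMN snippet, the model name, and the use of the OpenAI Python client are illustrative assumptions, and the study's actual prompts and evaluation setup are not reproduced here.

```python
# Minimal sketch (assumption, not the paper's code): ask a generative chatbot
# to interpret a BPMN fragment and suggest the next task to execute.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical, simplified BPMN process fragment used only for illustration.
BPMN_FRAGMENT = """
<bpmn:process id="OrderProcess">
  <bpmn:startEvent id="Start"/>
  <bpmn:userTask id="ReviewOrder" name="Review order"/>
  <bpmn:exclusiveGateway id="Approved" name="Order approved?"/>
  <bpmn:userTask id="ShipOrder" name="Ship order"/>
  <bpmn:endEvent id="End"/>
</bpmn:process>
"""

prompt = (
    "Given this BPMN process, the task 'Review order' was just completed "
    "and the order was approved. Which task should the user perform next?\n"
    + BPMN_FRAGMENT
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; the study compared GPT and PaLM
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

In the paper's terms, the interesting question is whether the model's answer respects the BPMN constructs (for example, honoring the exclusive gateway) rather than merely paraphrasing the task names.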

Similar Work