The Conversation Is The Command: Interacting With Real-world Autonomous Robot Through Natural Language

Linus Nwankwo, Elmar Rueckert. HRI 2024

[Paper] [Code]
Tags: Agent, Agentic, Applications, Has Code, Multimodal Models, RAG, Reinforcement Learning

In recent years, autonomous agents have proliferated in real-world environments such as homes, offices, and public spaces, yet natural human-robot interaction remains a key challenge. In this paper, we introduce an approach that synergistically exploits the capabilities of large language models (LLMs) and multimodal vision-language models (VLMs) to enable humans to interact naturally with autonomous robots through conversational dialogue. We leverage LLMs to decode high-level natural-language instructions from humans and abstract them into precise, robot-actionable commands or queries, and we utilise VLMs to provide a visual and semantic understanding of the robot's task environment. Our results, with 99.13% command recognition accuracy and 97.96% command execution success, show that our approach can enhance human-robot interaction in real-world applications. Video demonstrations of this paper can be found at https://osf.io/wzyf6, and the code is available in our GitHub repository (https://github.com/LinusNEP/TCC_IRoNL.git).
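The abstract describes the LLM decoding step only at a high level. As a rough illustration of what "abstracting a natural-language instruction into a precise, robot-actionable command" could look like, the minimal Python sketch below constrains an LLM to emit a structured JSON command and parses it into a typed object. The `query_llm` stub, the JSON schema, and the command vocabulary are hypothetical placeholders for illustration, not the interface from the paper's repository.

```python
import json
from dataclasses import dataclass

# Hypothetical stand-in for an LLM call; the actual model and prompt
# used in the paper are not specified here. This stub returns a canned
# response so the sketch runs end to end.
def query_llm(prompt: str) -> str:
    return '{"action": "move", "direction": "forward", "distance_m": 2.0}'

# Hypothetical schema constraining the LLM's output to actionable commands.
SYSTEM_PROMPT = (
    "Translate the user's instruction into a JSON robot command.\n"
    'Allowed actions: "move", "rotate", "stop".\n'
    'Schema: {"action": str, "direction": str, "distance_m": float}'
)

@dataclass
class RobotCommand:
    action: str
    direction: str
    distance_m: float

def decode_instruction(instruction: str) -> RobotCommand:
    """Decode a high-level natural-language instruction into a
    precise, robot-actionable command via the LLM."""
    raw = query_llm(f"{SYSTEM_PROMPT}\n\nInstruction: {instruction}")
    return RobotCommand(**json.loads(raw))

if __name__ == "__main__":
    cmd = decode_instruction("please go two meters forward")
    print(cmd)  # RobotCommand(action='move', direction='forward', distance_m=2.0)
```

In the full system, such a structured command would then be grounded in robot motion (for example, published as velocity or navigation goals), with the VLM supplying the visual and semantic scene context for queries about the environment.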

Similar Work