E2TP: Element To Tuple Prompting Improves Aspect Sentiment Tuple Prediction

Mohammad Ghiasvand Mohammadkhani, Niloofar Ranjbar, Saeedeh Momtazi. arXiv 2024

Attention Mechanism, Model Architecture, Prompting, Reinforcement Learning, Training Techniques

Generative approaches have significantly influenced Aspect-Based Sentiment Analysis (ABSA) and attracted considerable attention. However, existing studies typically predict the target tuples monolithically, overlooking the benefit of predicting single elements first. In this paper, we introduce Element to Tuple Prompting (E2TP), a two-step architecture: the first step predicts single elements, and the second completes the process by mapping these predicted elements to their corresponding tuples. E2TP is inspired by human problem-solving, which breaks a task into manageable parts and uses the first step's output as a guide in the second step. Within this strategy, we design three training paradigms, namely E2TP(\(diet\)), E2TP(\(f_1\)), and E2TP(\(f_2\)). Beyond dataset-specific experiments, our paper also addresses cross-domain scenarios, demonstrating the effectiveness and generalizability of the approach. A comprehensive analysis on various benchmarks shows that E2TP achieves new state-of-the-art results in nearly all cases.
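The two-step flow the abstract describes — predict single elements first, then map them to tuples — can be sketched as a pair of chained prompts. This is a minimal illustration, not the paper's implementation: the prompt templates, element format, and `generate` callback are all assumptions, with `generate` standing in for any text-to-text LLM call.

```python
# Hypothetical sketch of E2TP-style element-to-tuple prompting.
# Prompt wording and function names are illustrative assumptions,
# not the paper's exact templates.

def element_prompt(sentence: str) -> str:
    # Step 1: ask the model for the single elements on their own
    # (aspect terms, opinion terms, sentiment polarities).
    return (
        "Extract the aspect terms, opinion terms, and sentiment polarities "
        f"from the sentence.\nSentence: {sentence}\nElements:"
    )

def tuple_prompt(sentence: str, elements: str) -> str:
    # Step 2: feed step 1's output back in as a guide, and ask the
    # model to assemble the full tuples.
    return (
        f"Sentence: {sentence}\n"
        f"Predicted elements: {elements}\n"
        "Combine the elements above into (aspect, opinion, sentiment) tuples:"
    )

def e2tp_predict(sentence: str, generate) -> str:
    """Run the two-step pipeline; `generate` is any prompt -> text LLM call."""
    elements = generate(element_prompt(sentence))
    return generate(tuple_prompt(sentence, elements))
```

The key design point, mirroring the abstract, is that the second prompt is conditioned on the first step's output rather than asking for the tuples monolithically.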

Similar Work