NAP At Semeval-2023 Task 3: Is Less Really More? (back-)translation As Data Augmentation Strategies For Detecting Persuasion Techniques

Neele Falk, Annerose Eichel, Prisca Piccirilli. arXiv 2023

[Paper]    
Tags: Model Architecture, Pretraining Methods, RAG, Training Techniques, Transformer

Detecting persuasion techniques in news in a multi-lingual setup is non-trivial and comes with challenges, including scarce training data. Our system successfully leverages (back-)translation as a data augmentation strategy with multi-lingual transformer models for the task of detecting persuasion techniques. The automatic and human evaluation of our augmented data allows us to explore whether (back-)translation aids or hinders performance. Our in-depth analyses indicate that both data augmentation strategies boost performance; however, balancing human-produced and machine-generated data appears to be crucial.
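The augmentation idea the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration of back-translation for label-preserving paraphrases: the `translate` stub stands in for a real machine-translation model (the abstract does not specify which MT systems were used), and the example text and label are invented.

```python
def translate(text, src, tgt):
    # Placeholder: a real pipeline would call a machine-translation
    # model here. We tag the text so the round trip stays visible.
    return f"[{src}->{tgt}] {text}"

def back_translate(example, pivot="de"):
    """Round-trip an example through a pivot language to obtain a
    paraphrase that keeps the persuasion-technique label."""
    pivoted = translate(example["text"], "en", pivot)
    paraphrase = translate(pivoted, pivot, "en")
    return {"text": paraphrase, "label": example["label"]}

def augment(dataset, pivot="de"):
    # Keep the human-produced originals and add one machine-generated
    # paraphrase per example; the paper's finding suggests the balance
    # between the two kinds of data matters.
    return dataset + [back_translate(ex, pivot) for ex in dataset]

# Invented toy example for illustration only.
train = [{"text": "Only a fool would disagree.", "label": "ad_hominem"}]
augmented = augment(train)
```

The label is copied unchanged onto the paraphrase, which is what makes back-translation usable as augmentation for a classification task.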
