Transaction: ICL-SJTU Submission To EPIC-Kitchens Action Anticipation Challenge 2021

Gu Xiao, Qiu Jianing, Guo Yao, Lo Benny, Yang Guang-zhong. arXiv 2021

[Paper]    
Attention Mechanism Model Architecture Pretraining Methods RAG Transformer

This report presents the technical details of our submission to the EPIC-Kitchens Action Anticipation Challenge 2021. We developed a hierarchical attention model for action anticipation, which leverages a Transformer-based attention mechanism to aggregate features across the temporal dimension, modalities, and symbiotic branches, respectively. In terms of Mean Top-5 Recall of action, our submission under the team name ICL-SJTU achieved 13.39% on the overall test set, 10.05% on the unseen subset, and 11.88% on the tail subset. Notably, our submission ranked 1st in terms of verb class on all three (sub)sets.
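To make the idea of hierarchical attention aggregation concrete, the toy NumPy sketch below pools features first over the temporal dimension within each modality, then across modalities. This is an illustrative simplification, not the authors' actual model: the modality names (`rgb`, `flow`), feature sizes, and learned-query pooling are all hypothetical stand-ins for the Transformer-based aggregation described in the abstract.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(feats, query):
    """Aggregate a set of feature vectors into one via scaled dot-product attention.

    feats: (n, d) array of features; query: (d,) pooling query (hypothetical,
    standing in for a learned query in a real Transformer layer).
    """
    scores = feats @ query / np.sqrt(feats.shape[-1])  # (n,)
    weights = softmax(scores)                          # attention over the n items
    return weights @ feats                             # (d,) weighted sum

rng = np.random.default_rng(0)
T, d = 8, 16                         # 8 time steps, 16-dim features (arbitrary)
rgb = rng.normal(size=(T, d))        # per-frame features of one modality
flow = rng.normal(size=(T, d))       # per-frame features of another modality
q_time = rng.normal(size=d)          # pooling query for the temporal level
q_mod = rng.normal(size=d)           # pooling query for the modality level

# Level 1: aggregate across the temporal dimension, per modality.
rgb_vec = attention_pool(rgb, q_time)
flow_vec = attention_pool(flow, q_time)

# Level 2: aggregate across modalities into a single fused representation.
fused = attention_pool(np.stack([rgb_vec, flow_vec]), q_mod)
print(fused.shape)  # (16,)
```

The same pooling step could be stacked once more to merge symbiotic branches, mirroring the three levels of aggregation the abstract describes.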

Similar Work