
Predicting Issue Types With seBERT

Trautsch Alexander, Herbold Steffen. arXiv 2022

Tags: BERT, Model Architecture, Pretraining Methods, Transformer

Pre-trained transformer models are the current state of the art for natural language processing. seBERT is such a model: it is based on the BERT architecture, but trained from scratch on software engineering data. We fine-tuned this model for the task of issue type prediction in the NLBSE challenge. Our model outperforms the fastText baseline for all three issue types in both recall and precision, achieving an overall F1-score of 85.7%, an increase of 4.1% over the baseline.
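
The fine-tuning setup described in the abstract corresponds to standard sequence classification with a BERT-style encoder. Below is a minimal inference sketch using the Hugging Face Transformers API; the checkpoint path `path/to/seBERT-finetuned`, the label set (bug, enhancement, question, as used in the NLBSE issue report classification task), the 128-token limit, and the example issue text are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch: classifying an issue report with a fine-tuned BERT-style
# model via Hugging Face Transformers. Paths and labels are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed label set for the three NLBSE issue types.
LABELS = ["bug", "enhancement", "question"]

# Hypothetical local path or hub id of the fine-tuned checkpoint.
CHECKPOINT = "path/to/seBERT-finetuned"

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(
    CHECKPOINT, num_labels=len(LABELS)
)
model.eval()

def predict_issue_type(title: str, body: str) -> str:
    # Concatenate title and body as a single input sequence;
    # 128 tokens is an assumed truncation length.
    inputs = tokenizer(
        title + " " + body,
        truncation=True,
        max_length=128,
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, num_labels)
    return LABELS[int(logits.argmax(dim=-1))]

print(predict_issue_type(
    "App crashes on startup",
    "After updating to v2.3 the app exits immediately; stack trace attached.",
))
```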

Similar Work