
Scalable Educational Question Generation With Pre-trained Language Models

Sahan Bulathwela, Hamze Muse, Emine Yilmaz. arXiv 2023 – 21 citations

[Paper]    
Pre-Training Training Techniques Fine-Tuning

The automatic generation of educational questions will play a key role in scaling online education, enabling self-assessment at scale as a global population navigates personalised learning journeys. We develop EduQG, a novel educational question generation model built by adapting a large language model. Our extensive experiments demonstrate that EduQG can produce superior educational questions by further pre-training and fine-tuning a pre-trained language model on scientific text and science question data.
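The abstract describes a two-stage adaptation of a pre-trained language model: continued pre-training on scientific text, then fine-tuning on science question data so the model learns to generate questions from a passage. The sketch below illustrates only the second stage under stated assumptions; the base checkpoint ("t5-small"), the "generate question:" prompt prefix, and the single toy context–question pair are illustrative stand-ins, not the corpora, model, or hyperparameters used in the paper.

```python
# Minimal fine-tuning sketch: adapt a pre-trained seq2seq model to map a
# science passage to an educational question. Assumes the Hugging Face
# transformers and torch libraries; all data here is a toy placeholder.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # assumed base model for illustration only
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy training pair: passage in, educational question out.
context = ("generate question: Photosynthesis converts light energy "
           "into chemical energy in plants.")
question = "What process converts light energy into chemical energy in plants?"

inputs = tokenizer(context, return_tensors="pt", truncation=True)
labels = tokenizer(question, return_tensors="pt", truncation=True).input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for _ in range(3):  # a few gradient steps on the single toy example
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Generate a question for an unseen science passage.
model.eval()
test = tokenizer(
    "generate question: Mitochondria produce ATP through cellular respiration.",
    return_tensors="pt",
)
output_ids = model.generate(**test, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

In practice the fine-tuning stage would run over a full question-answering corpus with batching and evaluation, and would start from a checkpoint already further pre-trained on scientific text, as the abstract indicates.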

Similar Work