Training Optimus Prime, M.D.: Generating Medical Certification Items by Fine-tuning OpenAI's GPT-2 Transformer Model

Matthias von Davier. arXiv 2019

[Paper]    
Fine Tuning, GPT, Model Architecture, Pretraining Methods, Training Techniques, Transformer

This article describes new results from an application of transformer-based language models to automated item generation (AIG), an area of ongoing interest in certification testing as well as in educational measurement and psychological testing. OpenAI's pre-trained 345M-parameter GPT-2 language model was retrained on the public-domain text-mining set of PubMed articles and subsequently used to generate item stems (case vignettes) as well as distractor proposals for multiple-choice items. This case study shows promise: it produces draft text that human item writers can use as input for authoring. Future experiments with more recent transformer models (such as Grover and Transformer-XL) using existing item pools are expected to improve results further and to facilitate the development of assessment materials.
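The pipeline the abstract describes has two ends that can be sketched without the model itself: shaping a PubMed-style corpus into fixed-length chunks for language-model fine-tuning, and rendering a generated stem plus distractor proposals as a draft multiple-choice item for human review. The sketch below is a minimal illustration under stated assumptions; the function names (`prepare_training_chunks`, `format_item`), the whitespace tokenization, and the item layout are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of the data-preparation and item-drafting steps
# around GPT-2 fine-tuning for automated item generation (AIG).
# All names and formats here are illustrative, not from the paper.

def prepare_training_chunks(articles, max_tokens=1024):
    """Concatenate article texts and split them into fixed-size chunks of
    whitespace tokens, the usual shape of a GPT-2 fine-tuning corpus.
    (Real GPT-2 uses BPE subword tokens; whitespace is a stand-in.)"""
    tokens = " ".join(articles).split()
    return [" ".join(tokens[i:i + max_tokens])
            for i in range(0, len(tokens), max_tokens)]

def format_item(stem, options, answer_index):
    """Render a generated case vignette (stem) and distractor proposals
    as a draft multiple-choice item for human item writers to edit."""
    lines = [stem, ""]
    for label, option in zip("ABCDE", options):
        lines.append(f"({label}) {option}")
    lines.append("")
    lines.append(f"Key: ({'ABCDE'[answer_index]})")
    return "\n".join(lines)

# Toy corpus standing in for PubMed abstracts.
articles = [
    "A 54-year-old man presents with chest pain radiating to the left arm.",
    "A 30-year-old woman reports progressive fatigue and joint stiffness.",
]
chunks = prepare_training_chunks(articles, max_tokens=12)

# A generated stem and distractor proposals, drafted for human review.
item = format_item(
    "A 54-year-old man presents with chest pain. What is the next best step?",
    ["ECG", "Chest X-ray", "D-dimer", "Echocardiogram"],
    0,
)
print(item)
```

In the paper's workflow the chunked corpus would be fed to a GPT-2 retraining run, and the stem and options would come from sampled model output rather than literals; the draft item is explicitly an input for authoring, not a finished test question.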

Similar Work