
Prompt-engineering And Transformer-based Question Generation And Evaluation

Amyeen Rubaba. Arxiv 2023

[Paper]    
Applications BERT Model Architecture Pretraining Methods Prompting RAG Training Techniques Transformer

Question generation has numerous applications in the educational context. It can help students review content and test themselves, and a question generation model can aid teachers by lessening the burden of creating assessments and other practice material. This paper aims to find the best method for generating questions from textual data using a transformer model and prompt engineering. In this research, a pretrained DistilBERT model was fine-tuned on the SQuAD question answering dataset to generate questions. In addition to training a transformer model, prompt engineering was applied to the LLaMA model to generate questions effectively. The generated questions were compared against the baseline questions in the SQuAD dataset to evaluate the effectiveness of four different prompts. All four prompts demonstrated over 60% similarity on average, and 30% of the prompt-generated questions achieved a high similarity score greater than 70%.
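The abstract does not specify how similarity between prompt-generated questions and the SQuAD baseline questions was computed. The sketch below is a minimal, hypothetical illustration of such an evaluation, assuming cosine similarity over sentence embeddings; the embedding model, the example prompt template, and the 70% threshold are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical evaluation sketch: compare generated questions against SQuAD
# reference questions via cosine similarity of sentence embeddings.
from sentence_transformers import SentenceTransformer, util

# Illustrative prompt template for LLaMA-style question generation
# (the four prompts evaluated in the paper are not reproduced here).
PROMPT = (
    "Read the following passage and write one question it answers:\n\n"
    "{passage}\n\nQuestion:"
)

def similarity_report(generated, reference, threshold=0.7):
    """Return average similarity and the share of questions above the threshold."""
    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
    gen_emb = model.encode(generated, convert_to_tensor=True)
    ref_emb = model.encode(reference, convert_to_tensor=True)
    # Pairwise similarity of each generated question with its reference question.
    scores = util.cos_sim(gen_emb, ref_emb).diagonal()
    avg = scores.mean().item()
    high_share = (scores > threshold).float().mean().item()
    return avg, high_share

if __name__ == "__main__":
    gen = ["What dataset was DistilBERT fine-tuned on?"]
    ref = ["Which dataset was used to fine-tune the DistilBERT model?"]
    avg, high_share = similarity_report(gen, ref)
    print(f"average similarity: {avg:.2f}, share above threshold: {high_share:.0%}")
```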

Similar Work