Medical students create multiple-choice questions for learning in pathology education: a pilot study

Abstract

Background: Medical students facing high-stakes exams want study resources that have a direct relationship with their assessments. At the same time, they need to develop the skills to think analytically about complex clinical problems. Multiple-choice questions (MCQs) are widely used in medical education and can promote surface learning strategies, but creating MCQs requires both in-depth content knowledge and sophisticated analytical thinking. We therefore piloted an MCQ-writing task in which students developed MCQs for their peers to answer.

Methods: Students in a fourth-year anatomic pathology course (N = 106) were required to write MCQs using the PeerWise platform. Students created two MCQs for each of four topic areas, and the MCQs were answered, rated and commented on by their classmates. Questions were rated for cognitive complexity, and a paper-based survey was administered to investigate whether the activity was acceptable and feasible, and whether it promoted desirable learning behaviours in students.

Results: Students were able to create cognitively challenging MCQs: 313/421 (74%) of the MCQs we rated required the respondent to apply or analyse pathology knowledge. However, students who responded to the end-of-course questionnaire (N = 62) saw the task as having little educational value. Students found PeerWise easy to use, and indicated that they read widely to prepare questions and monitored the quality of their questions. They did not, however, engage in extensive peer feedback via PeerWise.

Conclusions: Our study showed that the MCQ-writing task was feasible and engaged students in self-evaluation and in synthesising information from a range of sources, but it was not well accepted and did not strongly engage students in peer learning. Although students were able to create complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other's MCQs. Because there is evidence that the task promoted deep learning, it is worth continuing this mode of teaching if the task can be made more acceptable to students.


Bibliographic Details
Main Authors: Rebecca Grainger, Wei Dai, Emma Osborne, Diane Kenwright
Author Affiliations: Rebecca Grainger, Wei Dai, Diane Kenwright (Department of Pathology and Molecular Medicine, University of Otago Wellington); Emma Osborne (Higher Education Development Centre, University of Otago Wellington)
Format: Article
Language: English
Published: BMC, 2018-08-01
Series: BMC Medical Education (ISSN 1472-6920)
Subjects: Student-generated MCQ; Multiple-choice questions; Assessment for learning; PeerWise; Bloom's taxonomy; Peer-instruction
Online Access: http://link.springer.com/article/10.1186/s12909-018-1312-1