Evaluating medical students’ proficiency with a handheld ophthalmoscope: a pilot study

Gregory Gilmour,1 James McKivigan2
1Physical Medicine and Rehabilitation, Michigan State University, Lansing, MI, USA; 2School of Physical Therapy, Touro University, Henderson, NV, USA


Bibliographic Details
Main Authors: Gilmour G, McKivigan J
Format: Article
Language: English
Published: Dove Medical Press, 2016-12-01
Series: Advances in Medical Education and Practice
ISSN: 1179-7258
Subjects: ophthalmoscope; medical student; physical exam; computer-based testing; education
Online Access:https://www.dovepress.com/evaluating-medical-studentsrsquo-proficiency-with-a-handheld-ophthalmo-peer-reviewed-article-AMEP
Description
Introduction: Historically, testing medical students' skills with a handheld ophthalmoscope has been difficult to do objectively. Many programs train students on plastic models of the eye, which are low-fidelity simulators of a real human eye, so it is hard to be sure that genuine proficiency is attained given the differences between the various models and actual patients. The purpose of this article is to introduce a testing method in which a medical student must match a patient to his/her fundus photo, ensuring objective evaluation while developing skills on real patients that are more likely to transfer directly into clinical practice.
Presentation of case: Fundus photos from standardized patients (SPs) were obtained using a retinal camera and placed into a grid using proprietary software. Medical students were then asked to examine an SP and attempt to match the patient to his/her fundus photo in the grid.
Results: Of the 33 medical students tested, only 10 were able to match the SP's eye to the correct photo in the grid. The average time to a correct selection was 175 seconds, and successful students rated their confidence at an average of 27.5%. Incorrect selections took less time, averaging 118 seconds, yet yielded a higher average student-reported confidence of 34.8%. The only significant predictor of success (p<0.05) was student age (p=0.02; see the illustrative sketch below the abstract).
Conclusion: These results suggest an apparent gap in the ophthalmoscopy training of the students tested. It is also concerning that students who selected the incorrect photo were more confident in their selections than those who chose the correct photo. More training may be needed to close this gap, and future studies should attempt to establish continuing protocols across multiple centers.
Keywords: standardized patient, software based, physical exam, computer-based testing, education
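The abstract reports that student age was the only significant predictor of matching success (p=0.02) but does not state which statistical test the authors used. As a rough illustration of one way such a question could be examined, the sketch below fits a logistic regression of match success on age; this is a minimal sketch under assumptions, and the data values are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch only: the article does not specify its statistical method,
# and the values below are hypothetical placeholders, not the study's raw data.
import numpy as np
import statsmodels.api as sm

# Hypothetical records: one row per student, with age in years and whether the
# fundus-photo match was correct (1) or incorrect (0).
ages = np.array([24, 25, 26, 27, 23, 29, 31, 26, 28, 24, 30, 27], dtype=float)
correct = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0])

# Logistic regression of match success on age, one plausible way to test whether
# age predicts success (the study reports p = 0.02 for age).
X = sm.add_constant(ages)                     # intercept + age
result = sm.Logit(correct, X).fit(disp=False)
print(result.summary())
print("p-value for age:", result.pvalues[1])
```

A chi-square test or point-biserial correlation would be equally plausible choices; the sketch only shows the general shape of such an analysis, not the authors' actual method.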