Summary:

Introduction: The aim of this study was to evaluate the reliability of an end-of-semester OSPE in the phantom course for operative dentistry in Frankfurt am Main, taking into account different modes of evaluation (examiner’s checklist versus instructor’s manual) and different numbers of examiners (three versus four).

Methods: In a historical, monocentric, comparative study, two methods of evaluation were compared in a real end-of-semester examination held in OSPE format (Group I: exclusive use of an examiner’s checklist versus Group II: use of an examiner’s checklist together with an instructor’s manual). Interrater reliability was analysed using generalisability theory, which generalises the concept of internal consistency (Cronbach’s alpha).

Results: The exclusive use of the examiner’s checklist yielded higher interrater reliability values than the examiner’s checklist supplemented by the in-depth instructor’s manual.

Conclusion: In summary, the examiner’s checklists used in the present study, without the instructor’s manual and in combination with three evaluators, resulted in the highest interrater reliability within the context of the completed OSPE.
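For readers unfamiliar with the analysis referenced above: in a fully crossed examinee-by-rater design, generalisability theory estimates variance components and combines them into a G-coefficient for the mean over the raters; with raters as the only facet, this coefficient coincides with Cronbach’s alpha. The following sketch is purely illustrative and uses invented scores, not data or code from the study.

```python
import numpy as np

# Hypothetical scores: rows = examinees, columns = raters (fully crossed p x r design).
# These numbers are illustrative only; they are not data from the study.
scores = np.array([
    [8, 7, 8],
    [6, 6, 5],
    [9, 9, 8],
    [5, 4, 5],
    [7, 8, 7],
], dtype=float)

n_p, n_r = scores.shape
grand = scores.mean()

# Mean squares for a two-way crossed design with one observation per cell.
ms_p = n_r * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)   # persons (examinees)
ms_r = n_p * np.sum((scores.mean(axis=0) - grand) ** 2) / (n_r - 1)   # raters
resid = (scores
         - scores.mean(axis=1, keepdims=True)
         - scores.mean(axis=0, keepdims=True)
         + grand)
ms_pr = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))                  # person x rater (+ error)

# Variance component estimates (negative estimates are truncated at zero).
var_pr = ms_pr
var_p = max((ms_p - ms_pr) / n_r, 0.0)
var_r = max((ms_r - ms_pr) / n_p, 0.0)

# Relative G-coefficient for the mean score over n_r raters; with a single rater
# facet this equals Cronbach's alpha computed across raters.
g_rel = var_p / (var_p + var_pr / n_r)
print(f"Relative G-coefficient over {n_r} raters: {g_rel:.3f}")
```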