Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items
Asymmetric IRT models have been shown useful for capturing heterogeneity in the number of latent subprocesses underlying educational test items (Lee and Bolt, 2018a). One potentially useful practical application of such models is toward the scoring of discrete-option multiple-choice (DOMC) items. Under the DOMC format, response options are independently and randomly administered up to the (last) keyed response, and thus the scheduled number of distractor response options to which an examinee may be exposed (and consequently the overall difficulty of the item) can vary. In this paper we demonstrate the applicability of Samejima's logistic positive exponent (LPE) model to response data from an information technology certification test administered using the DOMC format, and discuss its advantages relative to a two-parameter logistic (2PL) model in addressing such effects. Application of the LPE in the context of DOMC items is shown to (1) provide reduced complexity and a superior comparative fit relative to the 2PL, and (2) yield a latent metric with reduced shrinkage at high proficiency levels. The results support the potential use of the LPE as a basis for scoring DOMC items so as to account for effects related to key location.
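The abstract's description of DOMC administration (options presented one at a time in random order, stopping at the keyed response) can be illustrated with a deliberately simplified sketch. This is not the testing platform's actual procedure; the function name, the single-key assumption, and the option labels are all hypothetical, chosen only to show why the number of distractors an examinee sees varies across administrations.

```python
import random

def administer_domc(options, keyed, rng=random):
    """Simulate one DOMC administration (illustrative sketch only).

    Options are shown one at a time in a random order, and the
    administration stops once the keyed option has been presented.
    The number of distractors seen before the key (and hence the
    effective difficulty of the item) therefore varies by examinee.
    Assumes a single keyed option for simplicity.
    """
    order = list(options)
    rng.shuffle(order)           # random presentation order
    presented = []
    for opt in order:
        presented.append(opt)
        if opt == keyed:         # stop at the keyed response
            break
    return presented

# Hypothetical four-option item with key "C": an examinee may be
# shown anywhere from 1 to 4 options before administration ends.
seen = administer_domc(["A", "B", "C", "D"], keyed="C")
print(len(seen), seen)
```

The variable exposure to distractors shown here is exactly the key-location effect that motivates an asymmetric scoring model in the article.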
Main Authors: | Daniel M. Bolt, Sora Lee, James Wollack, Carol Eckerly, John Sowles |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2018-11-01 |
Series: | Frontiers in Psychology |
Subjects: | item response theory (IRT); computerized testing; multiple-choice; latent ability estimates; Samejima's logistic positive exponent (LPE) model |
Online Access: | https://www.frontiersin.org/article/10.3389/fpsyg.2018.02175/full |
id |
doaj-998d352011894f1a975ef05b2ca3606c |
record_format |
Article |
spelling |
doaj-998d352011894f1a975ef05b2ca3606c 2020-11-24T23:55:58Z eng Frontiers Media S.A. Frontiers in Psychology 1664-1078 2018-11-01 vol. 9 10.3389/fpsyg.2018.02175 419178
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items
Daniel M. Bolt (Department of Educational Psychology, University of Wisconsin-Madison, Madison, WI, United States); Sora Lee (Department of Educational Psychology, University of Wisconsin-Madison, Madison, WI, United States); James Wollack (Department of Educational Psychology, University of Wisconsin-Madison, Madison, WI, United States); Carol Eckerly (Educational Testing Service, Princeton, NJ, United States); John Sowles (Ericsson, Inc., Santa Clara, CA, United States)
https://www.frontiersin.org/article/10.3389/fpsyg.2018.02175/full
item response theory (IRT); computerized testing; multiple-choice; latent ability estimates; Samejima's logistic positive exponent (LPE) model |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Daniel M. Bolt, Sora Lee, James Wollack, Carol Eckerly, John Sowles |
spellingShingle |
Daniel M. Bolt, Sora Lee, James Wollack, Carol Eckerly, John Sowles
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items
Frontiers in Psychology
item response theory (IRT); computerized testing; multiple-choice; latent ability estimates; Samejima's logistic positive exponent (LPE) model |
author_facet |
Daniel M. Bolt, Sora Lee, James Wollack, Carol Eckerly, John Sowles |
author_sort |
Daniel M. Bolt |
title |
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items |
title_short |
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items |
title_full |
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items |
title_fullStr |
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items |
title_full_unstemmed |
Application of Asymmetric IRT Modeling to Discrete-Option Multiple-Choice Test Items |
title_sort |
application of asymmetric irt modeling to discrete-option multiple-choice test items |
publisher |
Frontiers Media S.A. |
series |
Frontiers in Psychology |
issn |
1664-1078 |
publishDate |
2018-11-01 |
description |
Asymmetric IRT models have been shown useful for capturing heterogeneity in the number of latent subprocesses underlying educational test items (Lee and Bolt, 2018a). One potentially useful practical application of such models is toward the scoring of discrete-option multiple-choice (DOMC) items. Under the DOMC format, response options are independently and randomly administered up to the (last) keyed response, and thus the scheduled number of distractor response options to which an examinee may be exposed (and consequently the overall difficulty of the item) can vary. In this paper we demonstrate the applicability of Samejima's logistic positive exponent (LPE) model to response data from an information technology certification test administered using the DOMC format, and discuss its advantages relative to a two-parameter logistic (2PL) model in addressing such effects. Application of the LPE in the context of DOMC items is shown to (1) provide reduced complexity and a superior comparative fit relative to the 2PL, and (2) yield a latent metric with reduced shrinkage at high proficiency levels. The results support the potential use of the LPE as a basis for scoring DOMC items so as to account for effects related to key location. |
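The contrast the abstract draws between the 2PL and Samejima's LPE can be made concrete with the standard forms of the two item response functions. In this sketch the item parameters (a, b) and the exponent xi are hypothetical illustrative values, not estimates from the article; the only claims encoded are that the LPE raises the logistic curve to a positive power, that xi = 1 recovers the 2PL, and that xi > 1 yields an asymmetric curve that approaches 1 more slowly at high proficiency.

```python
import math

def logistic(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def p_2pl(theta, a, b):
    """Two-parameter logistic model: symmetric ICC centered at theta = b."""
    return logistic(a * (theta - b))

def p_lpe(theta, a, b, xi):
    """Samejima's logistic positive exponent model.

    Raises the 2PL response curve to the power xi > 0, producing an
    asymmetric item characteristic curve; xi = 1 recovers the 2PL,
    while xi > 1 flattens the approach to 1 at high proficiency.
    """
    return logistic(a * (theta - b)) ** xi

# Hypothetical item parameters for illustration only
a, b, xi = 1.2, 0.0, 2.0
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(p_2pl(theta, a, b), 3), round(p_lpe(theta, a, b, xi), 3))
```

Because the LPE curve with xi > 1 sits below the 2PL curve everywhere except the limits, high-proficiency examinees are not pushed toward a ceiling as quickly, which is consistent with the reduced shrinkage at high proficiency levels reported in the abstract.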
topic |
item response theory (IRT); computerized testing; multiple-choice; latent ability estimates; Samejima's logistic positive exponent (LPE) model |
url |
https://www.frontiersin.org/article/10.3389/fpsyg.2018.02175/full |
work_keys_str_mv |
AT danielmbolt applicationofasymmetricirtmodelingtodiscreteoptionmultiplechoicetestitems AT soralee applicationofasymmetricirtmodelingtodiscreteoptionmultiplechoicetestitems AT jameswollack applicationofasymmetricirtmodelingtodiscreteoptionmultiplechoicetestitems AT caroleckerly applicationofasymmetricirtmodelingtodiscreteoptionmultiplechoicetestitems AT johnsowles applicationofasymmetricirtmodelingtodiscreteoptionmultiplechoicetestitems |
_version_ |
1725460375843897344 |