Test format effects: a componential approach to second language reading

Bibliographic Details
Main Author: Hyojung Lim
Format: Article
Language: English
Published: SpringerOpen 2019-04-01
Series: Language Testing in Asia
Online Access: http://link.springer.com/article/10.1186/s40468-019-0082-y
Description

Summary:

Background: This study aims to empirically answer the question of whether the role of sub-reading skills changes depending on the test format (e.g., multiple-choice vs. open-ended reading questions). The test format effect also bears on test validity: whether the reading test properly elicits construct-relevant reading skills or ability. The research questions guiding the study are as follows: (1) Do test scores differ systematically depending on the test format? (2) Do the predictors of test scores differ systematically depending on the test format?

Methods: Ninety Chinese ESL students at the post-secondary level participated in the study and took two TOEFL practice testlets, one with multiple-choice (MC) questions and the other with stem-equivalent open-ended (OE) questions. In addition to the reading comprehension test, the participants completed a vocabulary test, a grammar test, a word recognition task, a sentence processing task, a working memory test, and strategy questionnaires (reading and test-taking strategies).

Results: The participants performed better on the MC questions than on the corresponding OE questions, regardless of the text effect. More importantly, L2 reading tests in different formats involved different sub-reading components: vocabulary knowledge was the only significant predictor of MC test scores, whereas for the OE reading test, grammar knowledge, word recognition skills, and possibly inferencing strategies were found to be significant predictors.

Conclusion: Despite a number of limitations, the value of this study lies in its effort to empirically test format effects by taking a componential approach to reading. The findings suggest that differently formatted reading questions may tap into different sub-reading component skills. To accurately reveal the underlying structure of the reading construct being tested in MC and OE tests, however, we call for larger-scale data collection employing mixed research methods.
ISSN: 2229-0443