Measurements of student understanding on complex scientific reasoning problems
Main Author: | Izumi, Alisa Sau-Lin |
---|---|
Language: | ENG |
Published: | ScholarWorks@UMass Amherst, 2004 |
Subjects: | Educational evaluation; Science education |
Online Access: | https://scholarworks.umass.edu/dissertations/AAI3118308 |
id |
ndltd-UMASS-oai-scholarworks.umass.edu-dissertations-2286 |
record_format |
oai_dc |
collection |
NDLTD |
language |
ENG |
sources |
NDLTD |
topic |
Educational evaluation|Science education |
description |
While there has been much discussion of the cognitive processes underlying effective scientific teaching, less is known about how students respond to assessments that target processes of scientific reasoning specific to biology content. This study used multiple-choice (m-c) and short-answer essay student responses to evaluate progress in higher-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, some students showed a pre-post gain on the m-c version of the test while showing no gain on a short-answer essay version of the same questions. This result led to a subsequent research project focused on differences between alternate versions of tests of scientific reasoning. Using m-c and written responses from biology tests targeting the skills of (1) reasoning with a model and (2) designing controlled experiments, test score frequencies, factor analyses, and regression models were examined to explore test format differences. Understanding these format differences is important for developing practical ways to identify student gains in scientific reasoning. The overall results suggested test format differences. Factor analysis revealed three interpretable factors: m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open-explanation portions of the hybrid items revealed that many students answered the m-c portion of an item correctly but gave inadequate explanations. In other instances, students answered the m-c portion incorrectly yet gave a sufficient explanation, or answered the m-c correctly while providing a poor explanation. When the test scores were fit as predictors of non-associated student measures (VSAT, MSAT, high school grade point average, or final course grade), they accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing, and to the need for further research and development in the assessment of scientific reasoning. |
author |
Izumi, Alisa Sau-Lin |
title |
Measurements of student understanding on complex scientific reasoning problems |
publisher |
ScholarWorks@UMass Amherst |
publishDate |
2004 |
url |
https://scholarworks.umass.edu/dissertations/AAI3118308 |
_version_ |
1719364020294647808 |
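
As a rough illustration of the regression check described in the abstract (test scores fit as predictors of non-associated measures such as VSAT, accounting for close to zero percent of the variance), the sketch below shows how a variance-explained (R^2) figure of that kind can be computed. This is not the dissertation's code; the variable names, score ranges, and synthetic data are all hypothetical stand-ins.

```python
# Hypothetical sketch: regress an external student measure on reasoning-test
# scores and report the share of variance explained (R^2).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the real data: one row per student.
n = 200
mc_score = rng.integers(0, 11, size=n)     # multiple-choice score (0-10)
essay_score = rng.integers(0, 11, size=n)  # short-answer essay score (0-10)
vsat = rng.normal(550, 80, size=n)         # external measure, unrelated by construction

# Ordinary least squares: vsat ~ intercept + mc_score + essay_score
X = np.column_stack([np.ones(n), mc_score, essay_score])
beta, *_ = np.linalg.lstsq(X, vsat, rcond=None)
fitted = X @ beta

# R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((vsat - fitted) ** 2)
ss_tot = np.sum((vsat - vsat.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"R^2 = {r_squared:.3f}")  # near zero when predictors carry no signal
```

With predictors that carry no real information about the outcome, R^2 stays near zero; applying the same computation to actual m-c and essay scores against VSAT, MSAT, GPA, or course grade is one way to quantify the "close to zero percent of the variance" finding reported above.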