Testing Writing on Computers
Computer use has grown rapidly during the past decade. Within the educational community, interest in authentic assessment has also increased. To enhance the authenticity of tests of writing, as well as of other knowledge and skills, some assessments require students to respond in written form via paper-and-pencil. However, as increasing numbers of students grow accustomed to writing on computers, these assessments may yield underestimates of students' writing abilities. This article presents the findings of a small study examining the effect that mode of administration -- computer versus paper-and-pencil -- has on middle school students' performance on multiple-choice and written test questions. Findings show that, though multiple-choice test results do not differ much by mode of administration, for students accustomed to writing on computer, responses written on computer are substantially higher than those written by hand (effect size of 0.9 and relative success rates of 67% versus 30%). Implications are discussed in terms of both future research and test validity.
Main Authors: | Michael Russell, Walt Haney |
---|---|
Format: | Article |
Language: | English |
Published: | Arizona State University, 1997-01-01 |
Series: | Education Policy Analysis Archives |
ISSN: | 1068-2341 |
Online Access: | http://epaa.asu.edu/ojs/article/view/604 |