The design and evaluation of StudySieve: a tool that supports student-generated free-response questions, answers and evaluations

Asking students to reflect on course content and ask questions about that content has been shown to improve comprehension in numerous domains. More recently, tools have been developed to store multiple-choice questions created by students in an online repository where they can be shared, evaluated and discussed with their peers. Although benefits are reported from the use of such systems, multiple-choice questions are not suitable for all teaching contexts: many instructors prefer to use free-response questions to assess learning. This thesis describes the development and evaluation of an online tool designed to support pedagogies involving student-generated free-response questions. Using a design-based research methodology, an online tool named StudySieve was designed, implemented and evaluated in naturalistic educational settings. StudySieve was used in three large undergraduate Computer Science courses in which students developed questions with sample solutions, answered the questions contributed by their peers, and evaluated both questions and answers. The effectiveness of free-response question generation and related activities is evaluated by considering the response from students, the nature of the questions created, and the relationship between activity and subsequent exam performance.

Students believe that the activities of generating questions, answering questions, and evaluating both questions and answers help them learn. They rarely write more questions than required for assessment, but many students use the questions generated by their peers to practise prior to exams. The questions created by students reflect all the major topics of a course covered prior to the question-generation activity, but the cognitive level of student-generated questions tends to be lower than that of questions generated by instructors. Although student question-generation activity is correlated with exam performance, the correlations can be explained by other levels of activity in a course. However, students who write questions on a topic sometimes perform better in exams on that topic than students who have not authored a question on it. Generating free-response questions is a valuable educational activity, but the exact conditions that result in improved exam performance require further investigation in future research.


Bibliographic Details
Main Author: Luxton-Reilly, Andrew
Other Authors: Plimmer, Beryl; Hamer, John; Sheehan, Robert
Format: PhD Thesis, University of Auckland
Published: ResearchSpace@Auckland, 2012
Online Access: http://hdl.handle.net/2292/17625
Rights: Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated. Previously published items are made available in accordance with the copyright policy of the publisher (https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm). Copyright: The author.