A Series of Sensitivity Analyses Examining the What Works Clearinghouse's Guidelines on Attrition Bias

Bibliographic Details
Main Author: Lewis, Marsha S.
Language: English
Published: Ohio University / OhioLINK, 2013
Subjects: Educational Evaluation; Education Policy; Education; Statistics; attrition; bias; Monte Carlo simulation; What Works Clearinghouse; missing data
Online Access: http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1386239081
Abstract: This dissertation addresses the following overall research question: How do the amount and type of attrition, under varying assumptions about how strongly a subject's likelihood of dropping out of a study is related to his or her outcome, contribute to systematic bias in randomized controlled studies? The study first replicates a study conducted on behalf of the U.S. Department of Education's What Works Clearinghouse. Then, by applying a more systematic change in the magnitudes of the coefficients representing how strongly a subject's likelihood of dropping out is correlated with his or her outcome, and by varying the differential attrition rates, the study addresses the question: How sensitive is the measure of bias to changes in attrition rates and/or the relationship between outcome and a participant's propensity to respond in randomized controlled trials? The study also adds complexity to the bias modeling by addressing the question: How does varying the random error, to simulate differences in the reliability of the instruments used across studies, affect the attrition thresholds?

The methodology consisted of a series of eight Monte Carlo simulations (50,000 replications each) programmed in R. Each simulation varied one or more of the following components: the relationship (or correlation) between a study participant's outcome at follow-up and his or her propensity to respond (that is, the likelihood of not attriting from the study); the magnitude of differential attrition between the treatment and control groups; and the random error generated by the reliability of the outcome instruments.

The sensitivity analyses indicate that the What Works Clearinghouse attrition bias model is sensitive to changes in the assumptions about the relationship between attrition and outcome. The patterns in the findings indicate that the difference in the relationship between the propensity to respond and the outcome is as important to the bias estimates as the overall and differential attrition. Modifying the attrition bias formula so that the relationship between the propensity to respond and the outcome carries less weight until that relationship reaches a certain overall magnitude may provide more specific guidance to reviewers of studies in which, for example, a zero or near-zero relationship between the propensity to respond and the outcome is assumed for the control group. Varying the random error term in the model to address the potential impact of reliability on the bias thresholds indicates that the WWC attrition bias thresholds may be somewhat sensitive to varying instrument reliabilities across studies. This sensitivity may necessitate more specific guidance for reviewers of certain types of studies considered for inclusion in the U.S. Department of Education's What Works Clearinghouse.

Access: Unrestricted. This thesis or dissertation is protected by copyright: all rights reserved. It may not be copied or redistributed beyond the terms of applicable copyright laws.
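The abstract describes Monte Carlo simulations, programmed in R, that vary the correlation between a participant's outcome and propensity to respond, the differential attrition rate, and the reliability of the outcome instrument. The following is a minimal R sketch of one such simulation condition, not the dissertation's actual code; the sample size, correlations, response rates, and reliability value are illustrative assumptions.

```r
set.seed(42)

n_reps <- 1000    # replications per condition (the dissertation used 50,000)
n      <- 2000    # subjects per replication, split evenly across arms
rho_t  <- 0.3     # assumed correlation of outcome and response propensity, treatment
rho_c  <- 0.1     # assumed correlation of outcome and response propensity, control
rate_t <- 0.80    # assumed response (non-attrition) rate, treatment group
rate_c <- 0.70    # assumed response rate, control group (10-point differential attrition)
rel    <- 0.90    # assumed reliability of the outcome instrument

bias <- replicate(n_reps, {
  arm <- rep(c(1, 0), each = n / 2)                 # 1 = treatment, 0 = control
  # Observed outcome = true score + measurement error, scaled so the total
  # variance is 1 and the reliable share of the variance equals `rel`
  y <- sqrt(rel) * rnorm(n) + sqrt(1 - rel) * rnorm(n)
  # Latent response propensity correlated with the outcome, by arm
  rho <- ifelse(arm == 1, rho_t, rho_c)
  p   <- rho * y + sqrt(1 - rho^2) * rnorm(n)
  # Impose attrition by retaining the highest-propensity subjects in each arm
  # until the target response rates are met
  keep <- (arm == 1 & p >= quantile(p[arm == 1], 1 - rate_t)) |
          (arm == 0 & p >= quantile(p[arm == 0], 1 - rate_c))
  # Attrition bias = respondent-only effect estimate minus full-sample estimate
  (mean(y[keep & arm == 1]) - mean(y[keep & arm == 0])) -
    (mean(y[arm == 1]) - mean(y[arm == 0]))
})

mean(bias)  # average bias for this condition across replications
```

In this sketch the propensity is constructed so that its correlation with the observed outcome equals the chosen coefficient for each arm, and bias is measured as the gap between the treatment-effect estimate among respondents and the estimate from the full randomized sample.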