Meta-analysis and publication bias: How well does the FAT-PET-PEESE procedure work? A replication study of Alinaghi & Reed (Research Synthesis Methods, 2018)

A meta-analysis is a tool for aggregating estimates of a similar “effect” across many studies. Publication bias is the phenomenon whereby the literature is sample selected in favor of studies with statistically significant results and/or estimates that satisfy preconceived expectations. A popular procedure for conducting meta-analyses in the presence of publication bias is the FAT-PET-PEESE (FPP) procedure. In a recent paper published in Research Synthesis Methods, Alinaghi and Reed (2018), using Monte Carlo simulations, report that the FPP procedure does not work well in “realistic” data environments where true effects differ both across and within studies. AR’s findings are important because the FPP approach is dominant in the economics meta-analysis literature. I replicate their results and discover two mistakes, which I subsequently correct. The first mistake appears in a descriptive statistics table, misrepresenting the overview of the simulated datasets. The second is associated with the fixed-effect estimation, generating erroneous estimated effects and Type I error rates. Further, I extend their analysis by making their simulation environment even more realistic. Despite producing somewhat different results, my replications generally confirm AR’s conclusions about the unreliability of the FPP procedure in realistic data environments.
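To make the FPP mechanics concrete, here is a minimal sketch (not the authors' code) of the procedure applied to simulated data with significance-based publication selection. All function names, the selection rule (|t| > 1.96), and the parameter values are illustrative assumptions, not details taken from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

def wls(y, X, w):
    """Weighted least squares via the normal equations (X'WX)b = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)

def simulate_studies(n_studies=200, true_effect=0.5, select_significant=True):
    """Draw study estimates; optionally keep only 'publishable' (significant) ones."""
    estimates, ses = [], []
    while len(estimates) < n_studies:
        se = rng.uniform(0.1, 1.0)           # study precision varies
        est = rng.normal(true_effect, se)    # estimate = truth + sampling error
        if select_significant and abs(est / se) < 1.96:
            continue                         # insignificant result goes unpublished
        estimates.append(est)
        ses.append(se)
    return np.array(estimates), np.array(ses)

est, se = simulate_studies()
w = 1.0 / se**2  # precision weights

# FAT-PET: estimate_i = b0 + b1*SE_i. The slope b1 tests funnel asymmetry (FAT);
# the intercept b0 is the bias-corrected effect estimate (PET).
X_pet = np.column_stack([np.ones_like(se), se])
b0_pet, b1_fat = wls(est, X_pet, w)

# PEESE: estimate_i = b0 + b1*SE_i^2, typically used when PET rejects a zero effect.
X_peese = np.column_stack([np.ones_like(se), se**2])
b0_peese, _ = wls(est, X_peese, w)

print(f"naive mean:   {est.mean():.3f}")   # inflated by publication selection
print(f"PET effect:   {b0_pet:.3f}")
print(f"PEESE effect: {b0_peese:.3f}")
```

Note that this sketch uses a single true effect; AR's point is precisely that the procedure degrades when true effects are heterogeneous across and within studies.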


Bibliographic Details
Main Author: Sanghyun Hong (University of Canterbury, New Zealand)
Format: Article
Language: English
Published: ZBW 2019-07-01
Series: International Journal for Re-Views in Empirical Economics (ISSN 2566-8269)
Subjects: meta-analysis; publication bias; funnel asymmetry test (FAT); precision effect estimate with standard error (PEESE); Monte Carlo simulations; replication study
Online Access: https://doi.org/10.18718/81781.13
topic meta-analysis
publication bias
funnel asymmetry test (fat)
precision effect estimate with standard error (peese)
monte carlo simulations
replication study