Reproducible research and GIScience: an evaluation using AGILE conference papers
The demand for reproducible research is on the rise in disciplines concerned with data analysis and computational methods. Therefore, we reviewed current recommendations for reproducible research and translated them into criteria for assessing the reproducibility of articles in the field of geographic information science (GIScience). Using these criteria, we assessed a sample of GIScience studies from the Association of Geographic Information Laboratories in Europe (AGILE) conference series, and we collected feedback about the assessment from the study authors. Results from the author feedback indicate that although authors support the concept of performing reproducible research, the incentives for doing so in practice are too small. Therefore, we propose concrete actions for individual researchers and the GIScience conference series to improve transparency and reproducibility. For example, to support researchers in producing reproducible work, the GIScience conference series could offer awards and paper badges, provide author guidelines for computational research, and publish articles in Open Access formats.
Main Authors: Daniel Nüst, Carlos Granell, Barbara Hofer, Markus Konkol, Frank O. Ostermann, Rusne Sileryte, Valentina Cerutti
Format: Article
Language: English
Published: PeerJ Inc., 2018-07-01
Series: PeerJ
Subjects: GIScience, Open science, Reproducible research, Data science, AGILE, Reproducible conference publications
Online Access: https://peerj.com/articles/5072.pdf
id: doaj-27eca8a20238431c971c461de1c1f601
record_format: Article
DOI: 10.7717/peerj.5072
Record updated: 2020-11-25T00:45:17Z

Author affiliations:
Daniel Nüst: Institute for Geoinformatics, University of Münster, Münster, Germany
Carlos Granell: Institute of New Imaging Technologies, Universitat Jaume I de Castellón, Castellón, Spain
Barbara Hofer: Interfaculty Department of Geoinformatics - Z_GIS, University of Salzburg, Salzburg, Austria
Markus Konkol: Institute for Geoinformatics, University of Münster, Münster, Germany
Frank O. Ostermann: Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, Enschede, The Netherlands
Rusne Sileryte: Faculty of Architecture and the Built Environment, Delft University of Technology, Delft, The Netherlands
Valentina Cerutti: Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, Enschede, The Netherlands
collection: DOAJ
language: English
format: Article
sources: DOAJ
author: Daniel Nüst, Carlos Granell, Barbara Hofer, Markus Konkol, Frank O. Ostermann, Rusne Sileryte, Valentina Cerutti
author_sort: Daniel Nüst
title: Reproducible research and GIScience: an evaluation using AGILE conference papers
publisher: PeerJ Inc.
series: PeerJ
issn: 2167-8359
publishDate: 2018-07-01
description: The demand for reproducible research is on the rise in disciplines concerned with data analysis and computational methods. Therefore, we reviewed current recommendations for reproducible research and translated them into criteria for assessing the reproducibility of articles in the field of geographic information science (GIScience). Using these criteria, we assessed a sample of GIScience studies from the Association of Geographic Information Laboratories in Europe (AGILE) conference series, and we collected feedback about the assessment from the study authors. Results from the author feedback indicate that although authors support the concept of performing reproducible research, the incentives for doing so in practice are too small. Therefore, we propose concrete actions for individual researchers and the GIScience conference series to improve transparency and reproducibility. For example, to support researchers in producing reproducible work, the GIScience conference series could offer awards and paper badges, provide author guidelines for computational research, and publish articles in Open Access formats.
topic: GIScience, Open science, Reproducible research, Data science, AGILE, Reproducible conference publications
url: https://peerj.com/articles/5072.pdf