Supplementary material from "Analytic reproducibility in articles receiving open data badges at Psychological Science: an observational study"

Posted on 23.12.2020 - 14:03
For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one 'major numerical discrepancy' (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.

CITE THIS COLLECTION

Hardwicke, Tom E.; Bohn, Manuel; MacDonald, Kyle; Hembacher, Emily; Nuijten, Michèle B.; Peloquin, Benjamin N.; et al. (2020): Supplementary material from "Analytic reproducibility in articles receiving open data badges at Psychological Science: an observational study". The Royal Society. Collection. https://doi.org/10.6084/m9.figshare.c.5249529.v1

Royal Society Open Science

AUTHORS (10)

Tom E. Hardwicke
Manuel Bohn
Kyle MacDonald
Emily Hembacher
Michèle B. Nuijten
Benjamin N. Peloquin
Benjamin E. deMayo
Bria Long
Erica J. Yoon
Michael C. Frank