DOI
https://doi.org/10.25772/89T1-3X75
Defense Date
2023
Document Type
Dissertation
Degree Name
Doctor of Philosophy
Department
Biostatistics
First Advisor
Robert Perera
Abstract
In 2015, the Open Science Collaboration directly replicated 100 psychology studies and found astonishingly low replication rates. Since then, researchers have suggested factors that may have influenced the low rates, including the metrics used to assess replications. Existing definitions of replication success all suffer from flaws. Therefore, we propose a new metric for assessing replication that estimates the likelihood that a study successfully replicated, rather than forcing a binary choice, and accounts for study design limitations.
Using equivalence study techniques, we first propose a new metric to assess replication, defining a successful replication as one where either the replicated study’s effect size or the difference between the original and replicated effect sizes falls within a preset equivalence margin. We then compare our metric to current metrics using the Reproducibility Project data. Following this, we extend the approach to multiple studies using multivariate methods. Lastly, we design a survey to assess replication qualitatively.
We found that assessing replication on a continuous scale provides more information about a study’s probability of replication. Additionally, a study’s probability of replication is highly impacted by design elements such as sample size. When the equivalence metric was extended to multiple studies, replication probabilities decreased as between-study variance played a larger role.
Using equivalence studies to assess replication allows replication success to fall on a continuous scale, providing more information while also making it possible to assess the impact a study’s design elements have on replication rates.
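The equivalence-margin idea above can be illustrated with a minimal sketch. This is not the dissertation’s exact method: it assumes approximately normal effect-size estimates with known standard errors, and the function name, inputs, and margin value are illustrative. It computes the probability that the true difference between the original and replicated effect sizes lies within a preset equivalence margin, yielding a continuous measure of replication success rather than a binary verdict.

```python
# Hedged sketch (not the dissertation's exact metric): probability that
# the true difference in effect sizes lies within an equivalence margin,
# under a normal approximation with known standard errors.
from statistics import NormalDist

def replication_probability(es_orig, se_orig, es_rep, se_rep, margin):
    """Approximate P(-margin < true difference < +margin)."""
    diff = es_rep - es_orig
    se_diff = (se_orig**2 + se_rep**2) ** 0.5  # SE of the difference
    z = NormalDist()
    return z.cdf((margin - diff) / se_diff) - z.cdf((-margin - diff) / se_diff)

# Smaller standard errors (larger samples) push the probability toward
# 0 or 1, mirroring the finding that design elements such as sample
# size strongly influence a study's probability of replication.
p = replication_probability(es_orig=0.40, se_orig=0.10,
                            es_rep=0.25, se_rep=0.12, margin=0.20)
```

Because the output is a probability rather than a yes/no decision, underpowered replications naturally yield values near the middle of the scale, reflecting genuine uncertainty instead of a forced failure.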
Keywords: Replication, Underpowered Studies, Publication Bias, Equivalence Studies
Rights
© The Author
Is Part Of
VCU University Archives
Is Part Of
VCU Theses and Dissertations
Date of Submission
8-2-2023