The Reproducibility Project: Psychology was a crowdsourced collaboration of 270 contributing authors who attempted to repeat 100 published experimental and correlational psychological studies. The project was led by the Center for Open Science and its co-founder, Brian Nosek, who started the project in November 2011; the results of the collaboration were published in August 2015. Reproducibility is the ability to produce the same findings, using the same methodology as the original work, but with a different dataset (for instance, one collected from a different set of participants). The project illustrated the growing problem of failed reproducibility in social science and helped start a movement, since spread throughout the sciences, toward expanded testing of the reproducibility of published work.


Results

Brian Nosek of the University of Virginia and colleagues set out to replicate 100 studies, all published in 2008 in one of three journals: ''Psychological Science'', the ''Journal of Personality and Social Psychology'', and the ''Journal of Experimental Psychology: Learning, Memory, and Cognition'', to see whether they could obtain the same results as the original findings. In their initial publications, 97 of these 100 studies claimed to have significant results. The group took extensive measures to remain true to the original studies, including consulting the original authors. Even with these extra steps taken to recreate the conditions of the original 97 studies, only 35 (36.1%) replicated, and when effects did replicate, they were often smaller than those reported in the original papers. The authors emphasized that the findings reflect a problem affecting all of science, not just psychology, and that there is room to improve reproducibility in psychology. In 2021, the related Reproducibility Project: Cancer Biology reported that of 193 experiments from 53 top cancer papers published between 2010 and 2012, only 50 experiments from 23 papers could be replicated. Moreover, the effect sizes of the experiments that did replicate were on average 85% smaller than the original findings. None of the papers had its experimental protocols fully described, and 70% of the experiments required requesting key reagents from the original authors.
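
The headline figures can be read as simple proportions over study-level results. The sketch below uses entirely hypothetical numbers and a bare "significant in both original and replication" criterion; the project itself applied several complementary replication criteria.

```python
# Hypothetical per-study records: (original p < .05, replication p < .05,
# original effect size r, replication effect size r). Values are invented
# for illustration only; they are not the project's data.
studies = [
    (True,  True,  0.48, 0.30),
    (True,  False, 0.35, 0.05),
    (True,  False, 0.52, 0.21),
    (False, False, 0.10, 0.02),
]

# Under this simple criterion, only studies that originally reported a
# significant effect count as replication attempts.
attempts = [s for s in studies if s[0]]
replicated = [s for s in attempts if s[1]]

rate = len(replicated) / len(attempts)
shrinkage = sum(rep / orig for _, _, orig, rep in attempts) / len(attempts)

print(f"replication rate: {rate:.1%}")
print(f"average replication effect relative to original: {shrinkage:.1%}")
```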


Statistical relevance

Failure to replicate can have different causes. The first is a type II error, in which the null hypothesis is not rejected even though it is false; this is a false negative. The other is a type I error, the rejection of a null hypothesis that is actually true; this is a false positive.
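
Both error types can be made concrete with a small simulation. The sketch below is illustrative only (the sample size of 30 and true effect of 0.3 are arbitrary assumptions, not values from the project): it runs many one-sample t-tests when the null hypothesis is true and when it is false, and counts how often each kind of error occurs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, alpha, trials = 30, 0.05, 10_000

# Type I error: the null hypothesis is true (no effect), yet it is rejected.
false_pos = sum(
    stats.ttest_1samp(rng.normal(0.0, 1.0, n), 0.0).pvalue < alpha
    for _ in range(trials)
)

# Type II error: a real effect exists (mean 0.3), yet the null is not rejected.
false_neg = sum(
    stats.ttest_1samp(rng.normal(0.3, 1.0, n), 0.0).pvalue >= alpha
    for _ in range(trials)
)

print(f"type I error rate  ~ {false_pos / trials:.3f} (expected ~ alpha = {alpha})")
print(f"type II error rate ~ {false_neg / trials:.3f} (power ~ {1 - false_neg / trials:.3f})")
```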


Center for Open Science

The Center for Open Science was founded by Brian Nosek and Jeff Spies in 2013 with a $5.25 million grant from the Laura and John Arnold Foundation. By 2017 the foundation had provided an additional $10 million in funding.


Outcome and importance

The Reproducibility Project has had multiple implications. People have begun to question the legitimacy of scientific studies published in esteemed journals. Journals typically publish only articles with large effect sizes that reject the null hypothesis. This creates a serious problem: researchers unknowingly redo studies that have already failed, because there is no record of the failed attempts, which in turn leads to more false positives being published. It is unknown whether any of the original study authors committed fraud in publishing their work, but some authors of the original studies are among the 270 contributors to this project. One earlier study estimated that around $28 billion worth of research per year in medical fields is non-reproducible. The results of the Reproducibility Project might also affect public trust in psychology: lay people who learned about the low replication rate found in the Reproducibility Project subsequently reported lower trust in psychology than people who were told that a high number of the studies had replicated.
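
The link between selective publication and shrinking replication effects can be illustrated with a toy simulation. In the sketch below (hypothetical assumptions: a small true standardized effect of 0.2 and samples of 30), averaging only the studies that happen to reach significance overstates the true effect, which is one reason replications of published findings tend to report smaller effects.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_d, n, alpha, n_studies = 0.2, 30, 0.05, 20_000

all_effects, published = [], []
for _ in range(n_studies):
    sample = rng.normal(true_d, 1.0, n)
    d_hat = sample.mean() / sample.std(ddof=1)  # observed standardized effect
    all_effects.append(d_hat)
    # "Publish" only positive, statistically significant results.
    if d_hat > 0 and stats.ttest_1samp(sample, 0.0).pvalue < alpha:
        published.append(d_hat)

print(f"true effect size:            {true_d}")
print(f"mean across all studies:     {np.mean(all_effects):.3f}")
print(f"mean across published only:  {np.mean(published):.3f} (inflated by selection)")
```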


See also

* Invalid science
* John Ioannidis
* Meta-analysis
* Metascience
* Proteus phenomenon
* Publication bias
* Replication crisis
* Scientific method


External link


Official website

