Submission

introduction
title
Learning about and enabling reproducibility
short description
Studying reproducibility both reuses original data, code, and materials and creates an opportunity to increase the sharing of research outputs.
Submission Details
Please complete these prompts for your round one submission.
Submission Category
Data sharing
Abstract / Overview

We conducted a large-scale systematic investigation into the replicability of high-impact preclinical cancer biology experiments. We encountered challenges at every stage of designing and conducting the replications: it was hard to understand what was originally done, we could not always obtain the original data or reagents needed to conduct the experiments, and model systems frequently did not behave as originally reported. Limited transparency and incomplete reporting made the effort to replicate the findings much harder than necessary. When reporting the replication results, we made the process as open as possible to enable discovery, reproducibility, and reuse.

Team

The team comprised employees of the Center for Open Science (COS) and Science Exchange (SE) who came together to assess the replicability of preclinical cancer biology research as openly as possible. We coordinated the replication efforts, working with original authors to understand and gain access to original data, code, materials, and methodology. We also coordinated with researchers (identified from the Science Exchange network) to understand the approach they would take for the replications and how to make their data, code, materials, and methodology as open as possible. This support included submitting protocols and final reports to the open-access journal eLife for review, and depositing all digital materials (e.g., data, code) and physical materials in repositories (e.g., Addgene). We divided responsibilities according to organizational objectives: Science Exchange staff identified labs and coordinated operational aspects with them, while Center for Open Science staff focused on sharing practices, report writing, and analyses. The project was designed as a collaborative endeavor; ultimately, 200 individuals contributed in some way to its completion (https://www.cos.io/rpcb-contributors).

Potential Impact

The project was an eight-year effort involving 200 researchers, conducted transparently to allow community engagement with the project and its findings. Specifically, for each paper being replicated, the detailed protocols for the replication experiments were written up as a Registered Report and submitted to the open-access journal eLife for peer review; moreover, work on the replication experiments could not begin until the Registered Report had been accepted for publication. The completed replication experiments were then written up as a Replication Study, peer reviewed, and published in eLife. In addition to increasing transparency, this approach also increased the rigor of the replications by obtaining input early in the process. At the end of last year (2021), two capstone papers were published summarizing the entire project. One paper reported on the challenges encountered when preparing and conducting replications, such as insufficient detail to design a replication without seeking clarifications from the original authors; some of those authors (26%) were extremely helpful and generous with feedback, while others (32%) were not at all helpful or did not respond to requests. The second paper reported a meta-analysis of the results of the completed replication experiments. Replications provided much weaker evidence for the findings than the original experiments; for example, for original positive results, replication effect sizes were on average 85% smaller than the original effect sizes. The level of transparency demonstrated throughout the project was a major success, considering that one motivation for conducting the project was the lack of transparency in the reports by Bayer and Amgen.

The collections for eLife (https://elifesciences.org/collections/9b1e83d1/reproducibility-project-cancer-biology) and OSF (https://osf.io/collections/rpcb) allow others to search, explore, and reuse the content generated during this project.

The approaches taken in this project enabled discovery, engagement, and access to materials throughout its course. The benefit is enabling contribution at all stages (e.g., protocol development) rather than a single paper contribution at the end. This also allowed feedback, formally through peer review and informally through open-access sharing of content. Besides helping us coordinate multiple projects occurring at the same time, it also enables linking content to papers for increased engagement.

Replicability

We leveraged Registered Reports, in which the protocols, sample size calculations, and other design details were peer reviewed before experimentation began. In addition to increasing the rigor of the replication designs, this created transparent accountability for reporting all outcomes. This approach can be adopted by others now (for a full list of journals accepting this format, see http://cos.io/rr). Additionally, we deposited all data, code, and digital materials on the OSF (https://osf.io/collections/rpcb) and physical materials in open repositories (e.g., Addgene). To increase discoverability, these services provide collections and other ways to identify all outputs related to this effort, which can be used by others. We also provided links to the associated data, code, and other related materials directly in the published papers. For example, for each figure reporting an experiment, the figure legend included a link to the OSF project where one could find all the underlying data, code, and digital materials. This approach enables others to identify these outputs easily (see the last slides here (https://osf.io/zvda5) for illustration). These approaches can be replicated by others during manuscript preparation.

Potential for Community Engagement and Outreach

This project identified substantial challenges for cancer research, but they occur amid a reformation in science to address dysfunctional incentives, improve the research culture, increase transparency and sharing, and improve rigor in design and conduct of research. Science is at its best when it confronts itself and identifies ways to improve the quality and credibility of research findings. The progress that has been made in research despite the challenges described in this project suggests there is room for improvement. The Reproducibility Project: Cancer Biology is one contribution in an ongoing self-examination of research practices and opportunities for improvement. Future success of this project will be for it to continue to spur innovation in how to advance the efficiency of scientific progress.

The benefits for the individual researcher are manifold. Openness helps you, especially your future self, when you need to go back and revisit, reuse, or build on your own prior work. It helps others, who can advance science faster by not having to redo the steps you have already completed. Collectively we can make more progress than we can individually, but it takes each of us individually to make that happen.

Supporting Information (Optional)
Include links to relevant and publicly accessible website page(s), up to three relevant publications, and/or up to five relevant resources.
Supporting Documentation 01
https://www.cos.io/rpcb
Supporting Documentation 03
https://osf.io/collections/rpcb
Supporting Documentation 04
https://osf.io/an4sj/
