Protocol Problems: Figuring Out How an Experiment Was Done

Author(s)
  • Dana Lewis, PhD
    ASAP Program Officer

    Aligning Science Across Parkinson’s (ASAP) | USA

    Dr. Dana Lewis is a Program Officer at the Coalition for Aligning Science (CAS) and Aligning Science Across Parkinson’s (ASAP), a basic science initiative aimed at unraveling the etiology of Parkinson’s disease. Dr. Lewis earned her PhD in Neuroscience from George Washington University in the laboratory of Dr. Zayd Khaliq at the National Institute of Neurological Disorders and Stroke. She completed her postdoctoral work at the Johns Hopkins University School of Medicine in the laboratory of Dr. Maya Opendak at the Kennedy Krieger Institute. As a graduate student and postdoctoral fellow, Dr. Lewis’ research focused on connecting neurophysiological measurements of mesolimbic circuits with behavior and biomarkers of disease. In addition to her expertise in neurophysiology and systems neuroscience, Dr. Lewis is passionate about communicating science to scientists and nonscientists alike. She has served as editor of an undergraduate research journal and a scientific community newsletter, as an education consultant for a patient-focused nonprofit, as a lecturer, and on numerous committees focused on the communication and dissemination of science.

Repeating the experiment of a former lab member is a common rite of passage for biomedical graduate students. If the former lab member was meticulous and left detailed instructions, the task may pose little to no issue for the current student. If very little information about the steps of the experiment is available, however, the student may hit a major roadblock. Now extend this scenario to researchers who want to repeat an experiment that was originally conducted in a different lab: replicating it may prove extremely challenging. Replication, in which the result of a study is consistent with that of previous studies, is the gold standard of science and a key means of building confidence and credibility in research.1

Our goal is to ensure that the experiments that comprise ASAP-funded research are replicable. Below we make the case for why ASAP requires a recipe-style protocol or Methods paper to be shared for every Methods section in a manuscript. 

Less Than Half of Experiments Can Be Replicated

The Reproducibility Project: Cancer Biology set out to investigate the replicability of top findings in preclinical cancer biology research.2-3 The team began with a set of 193 experiments to perform, selected from high-impact papers published between 2010 and 2012. But by the end of the eight-year endeavor, only 50 (26%) of these experiments could even be completed.3

“…barriers include[d]: shortcomings in documentation of the original methodology; failures of transparency in original findings and protocols; failures to share original data, reagents, and other materials; methodological challenges encountered during the execution of the replication experiments. These challenges meant that we only completed 50 of the 193 experiments (26%) we planned to repeat.”3

This was due, in large part, to a severe lack of methodological detail. None of the 193 experiments was described in enough detail in its publication to permit replication.3 Even after clarification from the original authors, 59% of protocols required further modifications and iterations of the methods in order to complete the experiment.3

  • 0% of experiments contained enough detail in the manuscript for replication
  • 70% of experiments required more information about key reagents
  • 59% of protocols required further modifications to complete
  • 26% of planned experiments could be completed

There are four practices to consider when reporting the methods used in a paper:

  1. Convey variability: Replicable papers convey the often hidden variability (i.e., range of error) inherent to a method or protocol — for example, the number of animals that must be inoculated to obtain a successful cohort, or the number of replicates necessary to achieve a desired result.
  2. Link out to detailed protocols: Replicable papers link out to protocols.io or a published Methods paper that describes the protocol in detail rather than providing a brief overview of the protocol in the Methods section or citing other papers that have previously used a similar method. 
  3. Share details that are population- or mechanism-based: Replicable papers share protocols that are focused on population- or mechanism-based methods instead of time-based methods. For example, if a protocol states to wait 24 hours after seeding, share instead the empirical reason behind that (e.g., wait one cycle of division).
  4. Share all reagents with RRIDs: Research is not replicable or reproducible unless all research resources are properly shared, each identified by its Research Resource Identifier (RRID).

ASAP’s Approach to Replicability

When protocols are not shared in public repositories, the methodologies in the associated papers may not include sufficient detail for replication. To improve the replicability of ASAP-funded research, we require grantees to share a recipe-style protocol or Methods paper for every Methods section in a manuscript. An assessment of protocol availability across ASAP-funded publications shows that this policy has improved protocol sharing at the time of publication compared to the earliest ASAP-funded publications.

In the first year of ASAP’s policy implementation, 34.9% of protocols were shared at protocols.io in a recipe-style format or linked to detailed Methods papers. By 2024, that figure had risen to 62.3%, a 78.6% relative increase in protocol sharing.

Learn More About Protocol Sharing and the ASAP Open Science Policy

References

[1] National Academies of Sciences, Engineering, and Medicine. Reproducibility and replicability in science. The National Academies Press (2019). https://nap.nationalacademies.org/catalog/25303/reproducibility-and-replicability-in-science

[2] Errington et al. Reproducibility in cancer biology: Challenges for assessing replicability in preclinical cancer biology. eLife 10: e67995 (2021). https://pubmed.ncbi.nlm.nih.gov/34874008/

[3] Errington et al. Investigating the replicability of preclinical cancer biology. eLife 10: e71601 (2021). https://pubmed.ncbi.nlm.nih.gov/34874005/