Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology

Neal R. Haddaway*, Jos T A Verhoeven

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat the methods used, and to assimilate the study findings into powerful and useful meta-analyses. The problem is common across the variety of ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication of an improvement in reporting standards over time. Here, we call on authors, editors, and peer reviewers to consider repeatability as a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.

Original language: English
Pages (from-to): 4451-4454
Number of pages: 4
Journal: Ecology and Evolution
Volume: 5
Issue number: 19
DOIs
Publication status: Published - 1 Oct 2015

Keywords

  • Evidence synthesis
  • Experimental design
  • Meta-analysis
  • Reliability
  • Research legacy
  • Susceptibility to bias
  • Systematic review
  • Transparency
