Search for rigour

The take-away

  • Successful replication is the best way to confirm the validity of scientific research.
  • A string of high-profile problems has raised serious questions about published research.

When Nature surveyed 1,576 researchers in 2016, more than 70% reported having tried and failed to reproduce another scientist’s experiment at some point in their career, while more than half had failed to reproduce their own results. That’s a major problem, because replication is a cornerstone of science: a study yields results, which are then taken up by others and retested. Successful replication helps confirm validity; failure can signal that further scrutiny is needed. At least, that’s the theory.

“We need to incentivize methodological rigour and robust data analysis over results,” says Michèle Nuijten, an expert in meta-science from Tilburg University in the Netherlands. Unfortunately, headline-grabbing breakthroughs are more attractive to journals, so failed replication studies often go unpublished: only 13% of Nature’s respondents had ever had such a paper accepted.


A string of high-profile replication problems has forced scientists to rethink how they evaluate results. One notable admission came from pharmaceutical giant Bayer in 2011, when it revealed that two-thirds of its in-house studies identifying possible drug targets could not be replicated. In 2015, psychologist Brian Nosek made further waves by publishing the much-anticipated results of his Reproducibility Project, in which a group of researchers attempted to replicate 100 notable psychology papers published in 2008; just 36 proved replicable. In the same year, a study in PLoS Biology estimated that the US spends around $28 billion each year on preclinical research that can’t be replicated.

Some implicated scientists have felt publicly shamed; others have accused the replicators of incompetence. But many more see an opportunity to improve data analysis. “Openness is key,” says Nuijten. “Several journals now enforce policies of data sharing, which turns out to be a highly effective way to drive compliance among researchers.” Funders in Nuijten’s homeland have taken the push for replication further. In 2016, the Netherlands Organisation for Scientific Research (NWO) launched the first grants programme dedicated to replication, worth €3 million.

“That’s a relatively small amount,” says Nuijten, “but the announcement generated a lot of interest and sent a positive signal that funding agencies may be willing to invest in this vital type of research.”

