For a very long time, replication studies had a lowly status in the hierarchy of publishable research: the equation of scientific advance with novelty and discovery relegated them to second-class rank in the scientific world. Over the past few years, this has fortunately been changing fast.


Since the launch of this journal, we have sought to redefine what constitutes a significant scientific advance (https://www.nature.com/articles/s41562-016-0033). Science moves forward not only through discovery, but also through confirmation or disconfirmation of existing findings that have shaped a field. We therefore placed replication studies on the same level as studies that report new findings and adopted the following principle: if the original study was highly influential, then a study that replicates its methodology is of equal value.

Still, replication studies have been hard to find in our pages: they constitute a very small proportion of the submissions we receive. The current issue, by design, features four replication studies, all of high value in terms of their contribution to the scientific record.

These studies share a key feature: they replicate the methodology of highly influential prior research. They are also all highly rigorous: one is a Registered Report, while the three published as regular Articles were preregistered, and their authors transparently report any deviations from the preregistered protocols.

The Registered Report by Declerck et al. (https://doi.org/10.1038/s41562-020-0878-x) is notable for its format and authorship: the authors carried out a high-powered replication of a study on oxytocin and trust originally conducted by a subset of the same authors. Using the Registered Report format to replicate one's own work takes courage and a high level of commitment to scientific integrity. Authors clearly have a competing interest when it comes to their own work, and the regular path to publication offers less assurance that the outcomes are reliable, despite authors' best efforts at objectivity. By committing to the Registered Report format, the authors removed any doubt about the credibility of their findings. Declerck et al. did not confirm the original finding, but exploratory analyses suggested fruitful paths for further examining the potential effect of oxytocin on trust.

The replication study by Ruggeri et al. (https://doi.org/10.1038/s41562-020-0886-x) is notable for a different reason: unlike the other three projects, the authors confirmed the findings of the enormously influential original work they set out to replicate. There was already evidence in the field that the findings of prospect theory are reliable; however, by adopting a high-powered, multicultural approach, this article provides conclusive evidence of the reliability of prospect theory across 19 different countries. Many influential findings have recently been questioned, and a new and concerning form of publication bias is emerging: replication studies that do not confirm the original findings are perceived as more newsworthy and hence as more publication-worthy. We do not share this view: we committed to publishing the Registered Report by Declerck et al. regardless of the direction of the results, and we published the Article by Ruggeri et al. despite the fact that it does not report a possibly more newsworthy negative outcome.

The articles by Bakker et al. (https://doi.org/10.1038/s41562-020-0823-z) and Gluth et al. (https://doi.org/10.1038/s41562-020-0822-0) are notable for their approaches. Bakker et al. carried out both a direct replication of the original study and two conceptual replications, providing converging evidence that the original finding they tested is not supported. Gluth et al. combined a replication study with a new experiment and computational modelling to provide an alternative to the account of the relationship between value, attention, response times and decisions offered by an influential earlier study.

‘Replication failure’ has become a common catchphrase. But the four replication studies published in this issue demonstrate that robust, rigorous replication efforts do not fail: they invariably succeed in strengthening our body of knowledge and moving science forward.

This issue is a celebration of replication. We hope that, by featuring these contributions together and writing this editorial, we will inspire other scientists to devote their efforts to replication projects and encourage more funders to prioritize the funding of replication research. We certainly look forward to receiving and publishing many more replications in our pages.