While researchers have identified several practices that lead to research waste and low reproducibility over the past few decades, these problems largely remain unresolved. For example, an estimated 85% of clinical research funding is wasted (
Chalmers and Glasziou 2009;
Glasziou and Chalmers 2016) through nonpublication (
Riedel et al. 2021;
Ross et al. 2012;
Wieschowski et al. 2019); lack of clarity, completeness, and accuracy in published reports (
Glasziou et al. 2014); and flaws in study design (
Yordanov et al. 2015). Beyond clinical research, large-scale replication attempts have revealed low reproducibility of methods and results in disciplines as diverse as psychology (
Open Science Collaboration 2015), cancer biology (
Errington et al. 2021), economics (
Camerer et al. 2016), and water resource management (
Stagge et al. 2019). Research waste and low reproducibility arise in part because sharing protocols, data, and analysis scripts remains extremely rare (e.g., in psychology (
Hardwicke et al. 2020a), social sciences (
Hardwicke et al. 2020b), and biomedicine (
Iqbal et al. 2016)), and because researchers regularly misunderstand (
Gigerenzer 2004;
Hoekstra et al. 2014;
Lyu et al. 2020) and misapply (e.g.,
Nieuwenhuis et al. 2011) common statistical methods. In contrast to research misconduct, which occurs when an individual fabricates data, falsifies the research record, or plagiarizes the work of others, research waste and irreproducibility are widespread systemic issues with pernicious impacts. Fortunately, improvements in reproducibility also make misconduct more difficult to commit and easier to detect. Taken together, research waste and irreproducibility undermine the trustworthiness and value of research. In a largely publicly funded research environment, such as Canada’s, this waste is all the more problematic.
Researchers aren’t the only ones responsible for rigour and reproducibility. These issues are embedded in a complex research ecosystem that includes various parties with similar end-goals, but diverging incentives and proximate goals. Universities want to rank high in league tables and tend to hire and promote researchers based on journal Impact Factor and grant funding, whilst overlooking open and reproducible research practices such as data sharing and protocol registration (
Rice et al. 2020). Publishers want to attract readers and citations and generally prefer striking findings (
SAGE Publishing 2015;
Mlinarić et al. 2017;
Nature Publishing Group 2021), which can encourage questionable research practices (
Fiedler and Schwarz 2016;
Fraser et al. 2018;
Gopalakrishna et al. 2021) and spin (
Jellison et al. 2020). Some regulators develop policies to counter these issues (e.g.,
FDA 2016), but they often fail to monitor for compliance or provide the infrastructure necessary to meet the requirements (
EBM DataLab 2018;
Scaffidi et al. 2021;
TARG Meta-Research Group & Collaborators 2021). Simply telling researchers how to do rigorous and reproducible research is not enough. What could work best is a network approach that coordinates the incentives and proximate goals of researchers, funders, publishers, institutions, learned societies, regulators, and other stakeholders towards the common end-goal of maximizing the value of research. This approach requires training for researchers (e.g., workshops in open science), resources to implement best practices (e.g., data management staff or data champions), the development and use of user-friendly tools (e.g., data registries, experimental design assistants), regulations that come with audits and feedback, and a common understanding of the importance of rigour and reproducibility.
In a few countries, these coordinated efforts are underway. The United Kingdom has led the charge with several national-level reports on research culture (
Nuffield Council on Bioethics 2014;
Wilsdon et al. 2015;
House of Commons Science and Technology Committee 2018;
Vitae UK 2020) and a recent parliamentary inquiry into reproducibility and research integrity (
UK Parliament 2021). In parallel, British researchers founded the UK Reproducibility Network (UKRN;
ukrn.org). Launched in 2019, the network now comprises local networks at more than 50 universities; over 20 institutions that have formally joined by creating a senior academic lead role focused on research improvement; and external stakeholders including funders (e.g., UK Research and Innovation, Research England, Wellcome), learned societies (e.g., British Psychological Society), and publishers (e.g., Nature Publishing Group, Wiley). The UKRN has developed and delivered training programs on open research across the United Kingdom and has worked with researchers, institutions, and stakeholders to coordinate efforts to improve research quality. Its unified voice for reproducibility led to the recent award of £4.5M by Research England, a “major strategic investment” intended to drive the uptake of open research practices. These achievements speak to the power of a coordinated approach that provides a voice for researchers themselves.
Other countries, including Australia, Finland, Germany, Italy, Portugal, Slovakia, and Switzerland, have developed their own Reproducibility Networks. A handful of countries also have organizations that serve as hubs for research rigour and reproducibility, such as the Association for Interdisciplinary Meta-Research and Open Science (AIMOS) in Australia, the Center for Open Science (COS) and the Meta-Research Innovation Center at Stanford (METRICS) in the United States, the QUEST Center for Responsible Research in Germany, the Research on Research Institutes in the United Kingdom and the Netherlands, and the BRIGHTER Meta-Research Group in Brazil. Canadian researchers and organizations have expressed interest in these topics (e.g., the Centre for Journalology at the Ottawa Hospital Research Institute), but we lack a more formal structure to tackle these issues as a nation.
Canada punches above its weight in terms of the quantity of research output (
Nature Index 2021;
World Bank 2018), but we, like other countries, remain susceptible to the shortcomings discussed earlier. Canadian funders and institutions lag behind in reporting clinical trial results (
Cobey et al. 2017), and Canadian universities use hiring and promotion criteria that overlook practices such as data sharing, open access publishing, study registration, and use of reporting guidelines (
Rice et al. 2021). Some organizations aim to address these problems; for example, the recent Tri-Agency data management policy will soon require grant recipients to deposit their data in a digital repository (
Government of Canada 2021). While this policy is a step forward, a clear roadmap for its implementation is absent; implementation will require training, resources, financial support, and auditing for compliance (
Moher and Cobey 2021). At the moment, Canadian funders and universities lack publicly available data on compliance with their own policies on open access publishing, study registration, and data management. This shortcoming is especially salient given the Canadian Government’s dedication to Open Government, which includes a specific commitment to open science (
Government of Canada 2016). Meta-research specific to the Canadian research environment remains limited; expanding it would help elucidate the best paths forward.
By emulating national organizations such as the UKRN and the other initiatives mentioned above, we can accelerate Canada’s progress towards more rigorous and reproducible research. We can increase our attractiveness for international collaborations and international funding competitions. We can create a research culture that aligns stakeholders in the Canadian research ecosystem towards the common good of available, interpretable, and trustworthy research. The Canadian public, including patients and other end-users of research findings, would surely welcome such advances.
If you or your organization is interested in being part of such a network, please email the corresponding author.
Funding
Robert Thibault is supported by a general support grant awarded to METRICS from the Laura and John Arnold Foundation and a postdoctoral fellowship from the Fonds de recherche du Québec – Santé, Canada. The funders had no role in the preparation of the manuscript or decision to publish.