The last ten years have witnessed increasing awareness of questionable research practices (QRPs) in the life sciences, including p-hacking, HARKing, lack of replication, publication bias, low statistical power and lack of data sharing (see Figure 1). Concerns about such behaviours have been raised repeatedly for over half a century, but the incentive structure of academia has not changed to address them. Despite the complex motivations that drive academia, many QRPs stem from the simple fact that the incentives which offer success to individual scientists conflict with what is best for science. On the one hand is a set of gold standards that centuries of the scientific method have proven to be crucial for discovery: rigour, reproducibility, and transparency. On the other hand is a set of opposing principles born out of the academic career model: the drive to produce novel and striking results, the importance of confirming prior expectations, and the need to protect research interests from competitors. Within a culture that pressures scientists to produce rather than discover, the outcome is a biased and impoverished science in which most published results are either unconfirmed genuine discoveries or unchallenged fallacies. This observation implies no moral judgement of scientists, who are as much victims of this system as they are perpetrators.
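To make the inflation behind a practice such as p-hacking concrete, the sketch below (not drawn from any of the papers summarized here; the sample size, number of outcomes, and alpha level are illustrative assumptions) simulates studies in which no true effect exists and compares the false-positive rate when a single pre-specified outcome is tested against the rate when only the smallest p-value across several outcomes is reported.

# Hypothetical simulation (illustrative only): under the null hypothesis,
# testing several independent outcome measures and reporting just the
# smallest p-value inflates the false-positive rate well above 5%.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_simulations = 5_000   # simulated "studies"
n_per_group = 30        # participants per group (assumed)
n_outcomes = 5          # outcome measures tried per study (assumed)
alpha = 0.05

honest_hits = 0   # significant on the single pre-specified outcome
hacked_hits = 0   # significant on the best of all tried outcomes

for _ in range(n_simulations):
    # Both groups come from the same distribution, so any "effect" is noise.
    group_a = rng.normal(size=(n_per_group, n_outcomes))
    group_b = rng.normal(size=(n_per_group, n_outcomes))
    p_values = [stats.ttest_ind(group_a[:, j], group_b[:, j]).pvalue
                for j in range(n_outcomes)]
    honest_hits += p_values[0] < alpha
    hacked_hits += min(p_values) < alpha

print(f"False-positive rate, one pre-specified outcome: {honest_hits / n_simulations:.3f}")
print(f"False-positive rate, best of {n_outcomes} outcomes: {hacked_hits / n_simulations:.3f}")

With five independent outcomes, the chance of at least one p < 0.05 under the null is roughly 1 − 0.95^5 ≈ 0.23, which the simulation reproduces: analytic flexibility alone can more than quadruple the nominal 5% error rate, without any intent to deceive.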
We present a consensus-based checklist to improve and document the transparency of research reports in social and behavioural research. An accompanying online application allows users to complete the form and generate a report that they can submit with their manuscript or post to a public repository.
Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, funders and journals will require iterative evaluation and improvement. We discuss the goals of these measures, and how they can be implemented, in the hope that this will facilitate action toward improving the transparency, reproducibility and efficiency of scientific research.