
Reproducibility

The agreement of research results when studies are repeated. Resources cover reproducibility, replicability, repeatability, robustness, generalizability, organization, documentation, automation, dissemination, guidance, definitions, and more.

185 affiliated resources

Raiders of the lost HARK: a reproducible inference framework for big data science
Unrestricted Use
CC BY

Hypothesizing after the results are known (HARK) has been disparaged as data dredging, and safeguards including hypothesis preregistration and statistically rigorous oversight have been recommended. Despite potential drawbacks, HARK has deepened thinking about complex causal processes. Some of the HARK precautions can conflict with the modern reality of researchers’ obligations to use big, ‘organic’ data sources, from high-throughput genomics to social media streams. Here we propose a HARK-solid, reproducible inference framework suitable for big data, based on models that represent formalizations of hypotheses. Reproducibility is attained by employing two levels of model validation: internal (relative to data collated around hypotheses) and external (independent of the hypotheses used to generate data or of the data used to generate hypotheses). With a model-centered paradigm, the reproducibility focus changes from the ability of others to reproduce both data and specific inferences from a study to the ability to evaluate models as representations of reality. Validation underpins ‘natural selection’ in a knowledge base maintained by the scientific community. The community itself is thereby supported to be more productive in generating and critically evaluating theories that integrate wider, complex systems.
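The two validation levels described in the abstract can be sketched in code. This is a minimal illustration under invented assumptions (a simple linear model standing in for a formalized hypothesis, simulated data), not the authors' actual framework:

```python
import random

def fit_line(xs, ys):
    # Ordinary least-squares fit of y = a*x + b; the fitted model stands in
    # for a formalized hypothesis.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def mse(model, xs, ys):
    # Mean squared error of the model's predictions on (xs, ys).
    slope, intercept = model
    return sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys)) / len(xs)

def simulate(n, rng):
    # Hypothetical data source: y = 2x + 1 plus unit-variance noise.
    xs = [rng.uniform(0, 10) for _ in range(n)]
    ys = [2.0 * x + 1.0 + rng.gauss(0, 1) for x in xs]
    return xs, ys

xs, ys = simulate(200, random.Random(42))
model = fit_line(xs[:150], ys[:150])        # fit on the first 150 points
internal = mse(model, xs[150:], ys[150:])   # internal: held-out slice of the same collection
xs2, ys2 = simulate(200, random.Random(7))  # an independently generated data set
external = mse(model, xs2, ys2)             # external: independent of the original data
```

A model that scores well internally but poorly externally has likely overfit the data collated around the hypothesis, which is exactly the failure mode the external level guards against.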

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
Palgrave Communications
Author:
Iain E. Buchan
James S. Koopman
Jiang Bian
Matthew Sperrin
Mattia Prosperi
Mo Wang
Date Added:
08/07/2020
Rate and success of study replication in ecology and evolution
Unrestricted Use
CC BY

The recent replication crisis has caused several scientific disciplines to self-reflect on the frequency with which they replicate previously published studies and to assess their success in such endeavours. The rate of replication, however, has yet to be assessed for ecology and evolution. Here, I survey the open-access ecology and evolution literature to determine how often ecologists and evolutionary biologists replicate, or at least claim to replicate, previously published studies. I found that approximately 0.023% of ecology and evolution studies are described by their authors as replications. Two of the 11 original-replication study pairs provided sufficient statistical detail for three effects so as to permit a formal analysis of replication success. Replicating authors correctly concluded that they replicated an original effect in two cases; in the third case, my analysis suggests that the finding by the replicating authors was consistent with the original finding, contrary to the conclusion of “replication failure” by the authors.
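A formal analysis of replication success, as mentioned in the abstract, is often operationalized as a check of statistical consistency between the original and replication estimates. A minimal sketch, using one common criterion (the replication's point estimate falling inside the original 95% confidence interval) and hypothetical numbers, not the study's data:

```python
def ci(effect, se, z=1.96):
    # Approximate 95% confidence interval for an effect estimate.
    return effect - z * se, effect + z * se

def consistent(original, replication):
    # One common criterion (not the only one) for replication success:
    # the replication's point estimate falls inside the original 95% CI.
    lo, hi = ci(*original)
    return lo <= replication[0] <= hi

# Hypothetical (effect size, standard error) pairs:
original = (0.45, 0.10)
replication = (0.30, 0.12)
print(consistent(original, replication))  # True: 0.30 lies within (0.254, 0.646)
```

By this criterion a smaller replication estimate can still be consistent with the original, which is how a claimed "replication failure" can be overturned on reanalysis.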

Subject:
Biology
Ecology
Life Science
Material Type:
Reading
Provider:
PeerJ
Author:
Clint D. Kelly
Date Added:
08/07/2020
Recommendations for Increasing Replicability in Psychology
Unrestricted Use
CC BY

Replicability of findings is at the heart of any empirical science. The aim of this article is to move the current replicability debate in psychology towards concrete recommendations for improvement. We focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations. The challenges for improving replicability in psychological science are systemic. Improvement can occur only if changes are made at many levels of practice, evaluation, and reward.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
European Journal of Personality
Author:
Brent W. Roberts
Brian A. Nosek
David C. Funder
Filip De Fruyt
Hannelore Weber
Jaap J. A. Denissen
Jan De Houwer
Jelte M. Wicherts
Jens B. Asendorpf
Klaus Fiedler
Manfred Schmitt
Marcel A. G. van Aken
Marco Perugini
Mark Conner
Reinhold Kliegl
Susann Fiedler
Date Added:
08/07/2020
Replicability and Reproducibility in Comparative Psychology
Unrestricted Use
CC BY

Psychology faces a replication crisis. The Reproducibility Project: Psychology sought to replicate the effects of 100 psychology studies. Though 97% of the original studies produced statistically significant results, only 36% of the replication studies did so (Open Science Collaboration, 2015). This inability to replicate previously published results, however, is not limited to psychology (Ioannidis, 2005). Replication projects in medicine (Prinz et al., 2011) and behavioral economics (Camerer et al., 2016) resulted in replication rates of 25% and 61%, respectively, and analyses in genetics (Munafò, 2009) and neuroscience (Button et al., 2013) question the validity of studies in those fields. Science, in general, is reckoning with challenges in one of its basic tenets: replication. Comparative psychology also faces the grand challenge of producing replicable research. Though social psychology has borne the brunt of most of the critique regarding failed replications, comparative psychology suffers from some of the same problems faced by social psychology (e.g., small sample sizes). Yet, comparative psychology follows the methods of cognitive psychology by often using within-subjects designs, which may buffer it from replicability problems (Open Science Collaboration, 2015). In this Grand Challenge article, I explore the shared and unique challenges of and potential solutions for replication and reproducibility in comparative psychology.

Subject:
Economics
Psychology
Social Science
Material Type:
Reading
Provider:
Frontiers in Psychology
Author:
Jeffrey R. Stevens
Date Added:
08/07/2020
Reporting in Experimental Philosophy: Current Standards and Recommendations for Future Practice
Unrestricted Use
CC BY

Recent replication crises in psychology and other fields have led to intense reflection about the validity of common research practices. Much of this reflection has focussed on reporting standards, and how they may be related to the questionable research practices that could underlie a high proportion of irreproducible findings in the published record. As a developing field, it is particularly important for Experimental Philosophy to avoid some of the pitfalls that have beset other disciplines. To this end, here we provide a detailed, comprehensive assessment of current reporting practices in Experimental Philosophy. We focus on the quality of statistical reporting and the disclosure of information about study methodology. We assess all articles using quantitative methods (n = 134) published between 2013 and 2016 in 29 leading philosophy journals. We find that null hypothesis significance testing is the prevalent statistical practice in Experimental Philosophy, although relying solely on this approach has been criticised in the psychological literature. To augment this approach, various additional measures have become commonplace in other fields, but we find that Experimental Philosophy has adopted these only partially: 53% of the papers report an effect size, 28% report confidence intervals, 1% examine prospective statistical power, and 5% report observed statistical power. Importantly, we find no direct relation between an article’s reporting quality and its impact (number of citations). We conclude with recommendations for authors, reviewers, and editors in Experimental Philosophy, to facilitate making research statistically transparent and reproducible.
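The under-reported measures the abstract counts, effect sizes and confidence intervals, are straightforward to compute. A minimal sketch with hypothetical ratings from two conditions (invented data, not the paper's), using Cohen's d with a pooled standard deviation and a normal-approximation interval for the mean difference:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    # Cohen's d for two independent groups, using the pooled standard deviation.
    na, nb = len(a), len(b)
    pooled = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled

def mean_diff_ci(a, b, z=1.96):
    # Approximate 95% CI for the difference in means (normal approximation;
    # a t-based interval would be more exact for small samples).
    se = (stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b)) ** 0.5
    diff = mean(a) - mean(b)
    return diff - z * se, diff + z * se

# Hypothetical ratings from two vignette conditions:
control = [4.1, 3.8, 5.0, 4.4, 3.9, 4.6, 4.2, 4.8]
treatment = [5.2, 4.9, 5.8, 5.1, 4.7, 5.5, 5.0, 5.6]
d = cohens_d(treatment, control)
lo, hi = mean_diff_ci(treatment, control)
```

Reporting d alongside the interval, rather than a bare p-value, is exactly the kind of statistically transparent practice the article recommends.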

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Review of Philosophy and Psychology
Author:
Andrea Polonioli
Brittany Blankinship
David Carmel
Mariana Vega-Mendoza
Date Added:
08/07/2020
ReproducibiliTea
Read the Fine Print

Everything you need to know about this ECR-led journal club initiative that helps early career researchers create local Open Science groups that discuss issues, papers and ideas to do with improving science.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
UK Reproducibility Network
Date Added:
06/18/2020
Reproducibility Immersive Course
Conditional Remix & Share Permitted
CC BY-SA

Various fields in the natural and social sciences face a ‘crisis of confidence’. Broadly, this crisis amounts to a pervasiveness of non-reproducible results in the published literature. For example, in the field of biomedicine, Amgen published findings that out of 53 landmark published results of pre-clinical studies, only 11% could be replicated successfully. This crisis is not confined to biomedicine. Areas that have recently received attention for non-reproducibility include biomedicine, economics, political science, psychology, as well as philosophy. Some scholars anticipate the expansion of this crisis to other disciplines. This course explores the state of reproducibility. After giving a brief historical perspective, case studies from different disciplines (biomedicine, psychology, and philosophy) are examined to understand the issues concretely. Subsequently, problems that lead to non-reproducibility are discussed, as well as possible solutions and paths forward.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
New York University
Author:
Vicky Steeves
Date Added:
06/01/2018
Reproducibility Librarianship in Practice
Unrestricted Use
CC BY

As research across domains of study (librarianship included) has become increasingly reliant on digital tools, the challenges in reproducibility have grown. Alongside this reproducibility challenge are the demands for open scholarship, such as releasing code, data, and articles under an open license. Previously, researchers in the field captured their environments through observation, drawings, photographs, and videos; now, researchers and the librarians who work alongside them must capture digital environments and what they contain (e.g. code and data) to achieve reproducibility. Librarians are well-positioned to help patrons open their scholarship, and it’s time to build in reproducibility as a part of our services. Librarians are already engaged with research data management, open access publishing, grant compliance, and pre-registration; it’s time we as a profession add reproducibility to that repertoire. In this webinar, organised by LIBER’s Research Data Management Working Group, speaker Vicky Steeves discusses how she’s built services around reproducibility as a dual appointment between the Libraries and the Center for Data Science at New York University.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
New York University
Author:
Birgit Schmidt
Vicky Steeves
Date Added:
12/04/2018
Reproducibility, Preservation, and Access to Research with ReproZip and ReproServer
Conditional Remix & Share Permitted
CC BY-SA

The adoption of reproducibility remains low, despite incentives becoming increasingly common in different domains, conferences, and journals. The truth is, reproducibility is technically difficult to achieve due to the complexities of computational environments. To address these technical challenges, we created ReproZip, an open-source tool that packs research along with all the necessary information to reproduce it, including data files, software, OS version, and environment variables. Everything is then bundled into an .rpz file, which users can use to reproduce the work with ReproUnzip and an unpacker (Docker, Vagrant, and Singularity). The .rpz file is general and contains rich metadata: more unpackers can be added as needed, better guaranteeing long-term preservation. However, installing the unpackers can still be burdensome for secondary users of ReproZip bundles. In this paper, we will discuss how ReproZip and our new tool ReproServer can be used together to facilitate access to well-preserved, reproducible work. ReproServer is a cloud application that allows users to upload or provide a link to a ReproZip bundle, and then interact with/reproduce the contents from the comfort of their browser. Users are then provided a stable link to the unpacked work on ReproServer which they can share with reviewers or colleagues.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
New York University
Author:
Fernando Chirigati
Rémi Rampin
Vicky Steeves
Date Added:
05/31/2019
Reproducibility in Cancer Biology: The challenges of replication
Unrestricted Use
CC BY

Interpreting the first results from the Reproducibility Project: Cancer Biology requires a highly nuanced approach. Reproducibility is a cornerstone of science, and the development of new drugs and medical treatments relies on the results of preclinical research being reproducible. In recent years, however, the validity of published findings in a number of areas of scientific research, including cancer research, has been called into question (Begley and Ellis, 2012; Baker, 2016). One response to these concerns has been the launch of a project to repeat selected experiments from a number of high-profile papers in cancer biology (Morrison, 2014; Errington et al., 2014). The aim of the Reproducibility Project: Cancer Biology, which is a collaboration between the Center for Open Science and Science Exchange, is two-fold: to provide evidence about reproducibility in preclinical cancer research, and to identify the factors that influence reproducibility more generally.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
eLife
Author:
eLife Editors
Date Added:
08/07/2020
Reproducible Research
Read the Fine Print

Modern scientific research takes advantage of open-source languages such as Python and R. As such, they can be modified and shared by the wider community. Additional functionality comes from programs and packages such as IPython, Sweave, and Shiny. These packages can be used not only to execute data analyses, but also to present data and results consistently across platforms (e.g., blogs, websites, repositories, and traditional publishing venues).

The goal of the course is to show how to implement analyses and share them using IPython for Python, and Sweave and knitr for R (via RStudio), to create documents that are shareable and analyses that are reproducible.

Course outline is as follows:
1) Use of IPython notebooks to demonstrate and explain code, visualize data, and display analysis results
2) Applications of Python modules such as SymPy, NumPy, pandas, and SciPy
3) Use of Sweave to demonstrate and explain code, visualize data, display analysis results, and create documents and presentations
4) Integration and execution of IPython and R code and analyses using the IPython notebook
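The course's central theme, analyses that reproduce exactly, can be illustrated with a minimal sketch (a hypothetical toy analysis, not course material): seeding the random number generator pins down every random draw, so the analysis returns identical results on every run.

```python
import random
from statistics import mean

def analysis(seed):
    # Toy "analysis": simulate data and report a summary statistic.
    # Seeding the generator pins down every random draw, so the result
    # is identical on every run.
    rng = random.Random(seed)
    sample = [rng.gauss(100, 15) for _ in range(1000)]
    return round(mean(sample), 4)

first = analysis(seed=2020)
second = analysis(seed=2020)
```

In a notebook, recording the seed alongside the code is what lets a reader re-execute the document and obtain the published numbers exactly.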

Subject:
Applied Science
Information Science
Material Type:
Full Course
Author:
Christopher Ahern
Date Added:
08/07/2020
Reproducible Research Methods
Read the Fine Print

This is the website for the Autumn 2014 course “Reproducible Research Methods” taught by Eric C. Anderson at NOAA’s Southwest Fisheries Science Center. The course meets on Tuesdays and Thursdays from 3:30 to 4:30 PM in Room 188 of the Fisheries Ecology Division, and runs from October 7 to December 18.

The goal of this course is for scientists, researchers, and students to learn:

to write programs in the R language to manipulate and analyze data,
to integrate data analysis with report generation and article preparation using knitr,
to work fluently within the Rstudio integrated development environment for R,
to use git version control software and GitHub to effectively manage source code, collaborate efficiently with other researchers, and neatly package their research.

Subject:
Applied Science
Information Science
Material Type:
Full Course
Author:
Eric C. Anderson
Date Added:
08/07/2020
Reproducible Research: Walking the Walk
Read the Fine Print

Description

This hands-on tutorial will train reproducible research warriors in the practices and tools that make experimental verification possible, using an end-to-end data analysis workflow. The tutorial will expose attendees to open science methods from data gathering and storage through analysis to publication as a reproducible article.

Attendees are expected to have basic familiarity with scientific Python and Git.

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Matt McCormick
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Automation
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
- For future and current you
- For research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with

- links to the Materials
- schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Will be available after the Workshop

How this workshop is run
- This is a Carpentries Workshop
- that means friendly learning environment
- Code of Conduct
- active learning
- work with the people next to you
- ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
François Michonneau
Kim Gilbert
Matt Pennell
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Literate Programming
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
- For future and current you
- For research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with

- links to the Materials
- schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Will be available after the Workshop

How this workshop is run
- This is a Carpentries Workshop
- that means friendly learning environment
- Code of Conduct
- active learning
- work with the people next to you
- ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Ciera Martinez
Courtney Soderberg
Hilmar Lapp
Jennifer Bryan
Kristina Riemer
Naupaka Zimmerman
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Organization
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
- For future and current you
- For research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with

- links to the Materials
- schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Will be available after the Workshop

How this workshop is run
- This is a Carpentries Workshop
- that means friendly learning environment
- Code of Conduct
- active learning
- work with the people next to you
- ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Ciera Martinez
Courtney Soderberg
Hilmar Lapp
Jennifer Bryan
Kristina Riemer
Naupaka Zimmerman
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Publication
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
- For future and current you
- For research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with

- links to the Materials
- schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Will be available after the Workshop

How this workshop is run
- This is a Carpentries Workshop
- that means friendly learning environment
- Code of Conduct
- active learning
- work with the people next to you
- ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Dave Clements
Hilmar Lapp
Karen Cranston
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Version Control
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
- For future and current you
- For research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with

- links to the Materials
- schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Will be available after the Workshop

How this workshop is run
- This is a Carpentries Workshop
- that means friendly learning environment
- Code of Conduct
- active learning
- work with the people next to you
- ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Ciera Martinez
Hilmar Lapp
Karen Cranston
Date Added:
08/07/2020
Reproducible Science Workshop
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
- For future and current you
- For research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with

- links to the Materials
- schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Will be available after the Workshop

How this workshop is run
- This is a Carpentries Workshop
- that means friendly learning environment
- Code of Conduct
- active learning
- work with the people next to you
- ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Dan Leehr
Date Added:
08/07/2020