This resource is a video abstract of a research paper created by Research Square on behalf of its authors. It provides an easy-to-understand synopsis that can be used to introduce the topics it covers to students, researchers, and the general public. The video's transcript is also provided in full; a portion appears below as a preview:
"In microbiome research, quality control is essential to the repeatability and reproducibility of results. This is especially true for samples with a low bacterial load such as breast milk, where samples can easily become contaminated by reagents. In a new study, researchers propose a framework for an approach to address this challenge. The framework consists of three independent stages: 1) Verification of sequencing accuracy by assessing technical repeatability and reproducibility, 2) Contaminant removal and batch variability correction, and 3) Corroborating the repeatability and reproducibility of the microbiome composition and downstream analysis. The approach was validated using milk microbiota data from the CHILD Cohort, generated in two batches in 2016 and 2019. The framework helped to identify potential contaminant reagents that were missed with standard algorithms, substantially reducing contaminant-induced batch variability..."
The rest of the transcript, along with a link to the research, is available on the resource itself.
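The contaminant-removal stage of such a framework is often built on a frequency heuristic: reagent contaminants contribute a roughly constant number of reads, so their relative abundance tends to be inversely related to a sample's input DNA concentration. The Python sketch below is a minimal illustration of that heuristic on simulated data; it is not the authors' pipeline, and all counts, concentrations, and thresholds here are hypothetical.

```python
# Minimal sketch of frequency-based contaminant flagging on simulated data.
# Not the authors' pipeline; all values here are hypothetical.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_samples, n_taxa = 40, 6
dna_conc = rng.uniform(0.1, 10.0, n_samples)  # input DNA (ng/uL), hypothetical

# Genuine taxa scale with input DNA; taxon 5 mimics a reagent contaminant
# contributing a roughly constant read count regardless of DNA concentration.
counts = rng.poisson(dna_conc[:, None] * rng.uniform(5, 50, n_taxa)).astype(float)
counts[:, 5] = rng.poisson(100, n_samples)

rel_abund = counts / counts.sum(axis=1, keepdims=True)

# Flag taxa whose relative abundance falls as input DNA rises.
for taxon in range(n_taxa):
    rho, p = spearmanr(rel_abund[:, taxon], dna_conc)
    flagged = (rho < 0) and (p < 0.05)
    print(f"taxon {taxon}: rho={rho:+.2f}, p={p:.3g}, contaminant={flagged}")
```

On this simulation only the planted contaminant should be flagged; on real data, the threshold and the accompanying batch-effect correction would need the careful validation the study describes.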
Replicability and Reproducibility in Comparative Psychology
Psychology faces a replication crisis. The Reproducibility Project: Psychology sought to replicate the effects of 100 psychology studies. Though 97% of the original studies produced statistically significant results, only 36% of the replication studies did so (Open Science Collaboration, 2015). This inability to replicate previously published results, however, is not limited to psychology (Ioannidis, 2005). Replication projects in medicine (Prinz et al., 2011) and behavioral economics (Camerer et al., 2016) resulted in replication rates of 25% and 61%, respectively, and analyses in genetics (Munafò, 2009) and neuroscience (Button et al., 2013) question the validity of studies in those fields. Science, in general, is reckoning with challenges to one of its basic tenets: replication. Comparative psychology also faces the grand challenge of producing replicable research. Though social psychology has borne the brunt of the critique regarding failed replications, comparative psychology suffers from some of the same problems (e.g., small sample sizes). Yet comparative psychology follows the methods of cognitive psychology by often using within-subjects designs, which may buffer it from replicability problems (Open Science Collaboration, 2015). In this Grand Challenge article, I explore the shared and unique challenges of, and potential solutions for, replication and reproducibility in comparative psychology.
Recent replication crises in psychology and other fields have led to intense reflection about the validity of common research practices. Much of this reflection has focussed on reporting standards and how they may be related to the questionable research practices that could underlie a high proportion of irreproducible findings in the published record. As a developing field, it is particularly important for Experimental Philosophy to avoid some of the pitfalls that have beset other disciplines. To this end, here we provide a detailed, comprehensive assessment of current reporting practices in Experimental Philosophy. We focus on the quality of statistical reporting and the disclosure of information about study methodology. We assess all articles using quantitative methods (n = 134) published between 2013 and 2016 in 29 leading philosophy journals. We find that null hypothesis significance testing is the prevalent statistical practice in Experimental Philosophy, although relying solely on this approach has been criticised in the psychological literature. Various additional measures that augment this approach have become commonplace in other fields, but we find that Experimental Philosophy has adopted them only partially: 53% of the papers report an effect size, 28% report confidence intervals, 1% report prospective statistical power, and 5% report observed statistical power. Importantly, we find no direct relation between an article's reporting quality and its impact (number of citations). We conclude with recommendations for authors, reviewers and editors in Experimental Philosophy, to facilitate making research statistically transparent and reproducible.
Everything you need to know about this ECR-led journal club initiative that helps early career researchers create local Open Science groups that discuss issues, papers and ideas to do with improving science.
Various fields in the natural and social sciences face a ‘crisis of confidence’. Broadly, this crisis amounts to a pervasiveness of non-reproducible results in the published literature. For example, in the field of biomedicine, Amgen reported that of 53 landmark published results from pre-clinical studies, only 11% could be replicated successfully. This crisis is not confined to biomedicine: areas that have recently received attention for non-reproducibility also include economics, political science, psychology, and philosophy, and some scholars anticipate the expansion of this crisis to other disciplines. This course explores the state of reproducibility. After a brief historical perspective, case studies from different disciplines (biomedicine, psychology, and philosophy) are examined to understand the issues concretely. Subsequently, the problems that lead to non-reproducibility are discussed, as well as possible solutions and paths forward.
As research across domains of study has become increasingly reliant on digital tools (librarianship included), the challenges of reproducibility have grown. Alongside this reproducibility challenge are the demands for open scholarship, such as releasing code, data, and articles under an open license. Previously, researchers in the field captured their environments through observation, drawings, photographs, and videos; now, researchers and the librarians who work alongside them must capture digital environments and what they contain (e.g. code and data) to achieve reproducibility. Librarians are well-positioned to help patrons open their scholarship, and it's time to build in reproducibility as a part of our services. Librarians are already engaged with research data management, open access publishing, grant compliance, and pre-registration; it's time we as a profession add reproducibility to that repertoire. In this webinar, organised by LIBER's Research Data Management Working Group, speaker Vicky Steeves discusses how she has built services around reproducibility as a dual appointment between the Libraries and the Center for Data Science at New York University.
The adoption of reproducibility remains low, despite incentives becoming increasingly common across domains, conferences, and journals. The truth is, reproducibility is technically difficult to achieve because of the complexities of computational environments. To address these technical challenges, we created ReproZip, an open-source tool that packs research along with all the information necessary to reproduce it, including data files, software, OS version, and environment variables. Everything is then bundled into an .rpz file, which users can reproduce with ReproUnzip and an unpacker (Docker, Vagrant, or Singularity). The .rpz file is general and contains rich metadata: more unpackers can be added as needed, better guaranteeing long-term preservation. However, installing the unpackers can still be burdensome for secondary users of ReproZip bundles. In this paper, we discuss how ReproZip and our new tool ReproServer can be used together to facilitate access to well-preserved, reproducible work. ReproServer is a cloud application that allows users to upload, or provide a link to, a ReproZip bundle and then interact with and reproduce its contents from the comfort of their browser. Users are then given a stable link to the unpacked work on ReproServer that they can share with reviewers or colleagues.
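As a rough sketch of this trace-pack-reproduce cycle, the steps below drive the ReproZip command-line tools from Python; it assumes reprozip and reprounzip are installed, and the experiment command and file names are hypothetical placeholders.

```python
# Sketch of the ReproZip trace/pack/reproduce cycle, scripted via subprocess.
# Assumes reprozip and reprounzip are installed; "experiment.py" and the
# bundle/directory names are hypothetical placeholders.
import subprocess

# 1. Trace the experiment, recording the files, libraries, and environment used.
subprocess.run(["reprozip", "trace", "python", "experiment.py"], check=True)

# 2. Pack the trace and everything it touched into a self-contained .rpz bundle.
subprocess.run(["reprozip", "pack", "experiment.rpz"], check=True)

# 3. On another machine, set up and reproduce with an unpacker such as Docker.
subprocess.run(["reprounzip", "docker", "setup", "experiment.rpz", "exp"], check=True)
subprocess.run(["reprounzip", "docker", "run", "exp"], check=True)
```

ReproServer then removes even this setup burden: a secondary user only uploads the .rpz file (or a link to it) and reproduces the work in the browser.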
This course was developed and taught by Ben Marwick, Professor of Archaeology at the University of Washington. A requirement for the UW Master of Science in Data Science, it introduces students to the principles and tools of computational reproducibility in data science using R. Topics covered include acquiring, cleaning, and manipulating data in a reproducible workflow using the tidyverse. Students use literate programming tools and explore best practices for organizing data analyses. Students learn to write documents using R Markdown, compile R Markdown documents using knitr and related tools, and publish reproducible documents in various common formats. Students also learn strategies and tools for packaging research compendia, dependency management, and containerising projects to provide computational isolation.
Interpreting the first results from the Reproducibility Project: Cancer Biology requires a highly nuanced approach. Reproducibility is a cornerstone of science, and the development of new drugs and medical treatments relies on the results of preclinical research being reproducible. In recent years, however, the validity of published findings in a number of areas of scientific research, including cancer research, has been called into question (Begley and Ellis, 2012; Baker, 2016). One response to these concerns has been the launch of a project to repeat selected experiments from a number of high-profile papers in cancer biology (Morrison, 2014; Errington et al., 2014). The aim of the Reproducibility Project: Cancer Biology, a collaboration between the Center for Open Science and Science Exchange, is two-fold: to provide evidence about reproducibility in preclinical cancer research, and to identify the factors that influence reproducibility more generally.
RQM is a research methods course that focuses on modernizing the post-data-collection portion of the scientific workflow. The course takes an approach that both produces conventional research products and trains students to make their work more efficient and reproducible. This handbook provides a framework for professors who would like to teach a 14-week class on reproducible quantitative methods, presuming an understanding of open workflows for publication, intermediate skills in R (or other command-line data analysis software), and basic GitHub operations and use.
Modern scientific research takes advantage of open-source languages such as Python and R, which can be modified and shared by the wider community. Further functionality is available through programs and packages such as IPython, Sweave, and Shiny. These packages can be used not only to execute data analyses but also to present data and results consistently across platforms (e.g., blogs, websites, repositories, and traditional publishing venues).
The goal of the course is to show how to implement analyses and share them, using IPython for Python, and Sweave and knitr for RStudio, to create documents that are shareable and analyses that are reproducible.
The course outline is as follows:
1) Use of IPython notebooks to demonstrate and explain code, visualize data, and display analysis results
2) Applications of Python modules such as SymPy, NumPy, pandas, and SciPy
3) Use of Sweave to demonstrate and explain code, visualize data, display analysis results, and create documents and presentations
4) Integration and execution of IPython and R code and analyses using the IPython notebook
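As a hedged flavor of items 1 and 2, the snippet below shows the kind of short analysis one might run and narrate inside an IPython notebook, using pandas for a summary table and SciPy for a test; the data and column names are made up for illustration.

```python
# A small notebook-style analysis: summarize simulated data with pandas,
# then test a group difference with SciPy. All data here are made up.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "group": ["control"] * 30 + ["treatment"] * 30,
    "score": np.concatenate([rng.normal(50, 10, 30), rng.normal(55, 10, 30)]),
})

print(df.groupby("group")["score"].describe())  # per-group summary table

control = df.loc[df["group"] == "control", "score"]
treatment = df.loc[df["group"] == "treatment", "score"]
t, p = stats.ttest_ind(control, treatment)
print(f"t = {t:.2f}, p = {p:.3f}")  # two-sample t-test
```

In a notebook, each of these steps would sit in its own cell, with narrative text and figures interleaved so that the document itself re-executes end to end.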
This is the website for the Autumn 2014 course “Reproducible Research Methods” taught by Eric C. Anderson at NOAA’s Southwest Fisheries Science Center. The course meets on Tuesdays and Thursdays from 3:30 to 4:30 PM in Room 188 of the Fisheries Ecology Division. It runs from October 7 to December 18.
The goal of this course is for scientists, researchers, and students to learn:
- to write programs in the R language to manipulate and analyze data,
- to integrate data analysis with report generation and article preparation using knitr,
- to work fluently within the RStudio integrated development environment for R,
- to use git version control software and GitHub to effectively manage source code, collaborate efficiently with other researchers, and neatly package their research.
Description
This hands-on tutorial will train reproducible research warriors in the practices and tools that make experimental verification possible with an end-to-end data analysis workflow. The tutorial will expose attendees to open science methods during data gathering, storage, and analysis, through to publication as a reproducible article.
Attendees are expected to have basic familiarity with scientific Python and Git.
Workshop goals
- Why are we teaching this; why is this important
  - for future and current you
  - for research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with
  - links to the Materials
  - schedule

Structure oriented along the Four Facets of Reproducibility:

How this workshop is run
- This is a Carpentries Workshop, which means:
  - a friendly learning environment and a Code of Conduct
  - active learning: work with the people next to you and ask for help
No restrictions on your remixing, redistributing, or making derivative works. Give credit to the author, as required.
Your remixing, redistributing, or making of derivative works comes with some restrictions, including how it is shared.
Your redistributing comes with some restrictions. Do not remix or make derivative works.
Most restrictive license type. Prohibits most uses, sharing, and any changes.
Copyrighted materials, available under Fair Use and the TEACH Act for US-based educators, or other custom arrangements. Go to the resource provider to see their individual restrictions.