Students overlay USGS topographic maps onto Google Earth’s satellite imagery. By analyzing Denali, a mountain in Alaska, they discover how to use map scales as ratios to navigate maps and use rates to make sense of contour lines and elevation changes in an integrated GIS software program. Students also problem-solve to find potential pathways up a mountain by calculating gradients.
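The gradient step in this lesson reduces to rise over run, with the run recovered from the map scale ratio. A minimal Python sketch with hypothetical numbers (the scale, contour interval, and measured distance below are illustrative, not taken from the lesson materials):

```python
# A minimal sketch (hypothetical numbers) of the gradient calculation the lesson asks for:
# gradient = elevation change / horizontal ground distance, with the ground distance
# recovered from the map scale ratio.

map_scale = 24_000          # 1:24,000 USGS quad: 1 map unit = 24,000 ground units
map_distance_cm = 5.0       # distance measured on the map between two points
contour_interval_m = 40     # elevation gained per contour line
contours_crossed = 6

# Convert the map measurement to ground distance using the scale ratio (cm -> m).
ground_distance_m = map_distance_cm * map_scale / 100

# Elevation change is the number of contour lines crossed times the contour interval.
elevation_change_m = contours_crossed * contour_interval_m

# Gradient (rise over run), often reported as a percent slope.
gradient = elevation_change_m / ground_distance_m
print(f"Slope: {gradient:.2%}")   # 20.00% for these illustrative numbers
```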
Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher, and find that higher-impact journals are more likely to have open data and code policies and that journals published by scientific societies are more likely to have such policies than those published commercially. We also find that open data policies tend to precede open code policies, and we find no relationship between open data and code policies and either supplemental materials policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one-year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
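The abstract does not specify the model form, so the following is only a minimal sketch assuming a logistic regression of policy adoption on impact factor and publisher type; the file name and column names are hypothetical, not the authors' code or data.

```python
# Sketch of a policy-adoption model of the kind described in the abstract, assuming a
# logistic regression on impact factor and publisher type. The model form, column names,
# and data file are assumptions for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

journals = pd.read_csv("journal_policies_2012.csv")  # hypothetical file with columns:
# open_data_policy (0/1), impact_factor (float), publisher_type ("society"/"commercial")

model = smf.logit(
    "open_data_policy ~ impact_factor + C(publisher_type)",
    data=journals,
).fit()
print(model.summary())  # coefficients for impact factor and society-vs-commercial publisher
```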
Several fields of science are experiencing a "replication crisis" that has negatively impacted their credibility. Assessing the validity of a contribution via replicability of its experimental evidence and reproducibility of its analyses requires access to relevant study materials, data, and code. Failing to share them limits the ability to scrutinize or build upon the research, ultimately hindering scientific progress. Understanding how the diverse research artifacts in HCI impact sharing can help produce informed recommendations for individual researchers and policy-makers in HCI. Therefore, we surveyed authors of CHI 2018–2019 papers, asking if they share their papers' research materials and data, how they share them, and why they do not. The results (N = 460/1356, 34% response rate) show that sharing is uncommon, partly due to misunderstandings about the purpose of sharing and reliable hosting. We conclude with recommendations for fostering open research practices. This paper and all data and materials are freely available at https://osf.io/csy8q
Background: Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature. Objective: This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating the presence of 8 indicators of reproducible and transparent research practices. Methods: By implementing a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018. After generating a list of eligible dermatology publications, we then searched for full text PDF versions by using Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency—availability of materials, data, analysis scripts, protocol, preregistration, conflict of interest statement, funding statement, and open access—using a pilot-tested Google Form. Results: After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were more poorly reported than others. We found that most publications (113, 88.9%) did not provide unmodified, raw data used to make computations, 124 (97.6%) failed to make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts. Conclusions: Our sample of studies published in dermatology journals does not appear to include sufficient detail to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted. More robust reporting of key methodological details, open data sharing, and stricter standards journals impose on authors regarding disclosure of study materials might help improve the climate of reproducible research in dermatology. [JMIR Dermatol 2019;2(1):e16078]
A study by David Baker and colleagues reveals poor quality of reporting in pre-clinical animal research and a failure of journals to implement the ARRIVE guidelines. There is growing concern that poor experimental design and lack of transparent reporting contribute to the frequent failure of pre-clinical animal studies to translate into treatments for human disease. In 2010, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) guidelines were introduced to help improve reporting standards. They were published in PLOS Biology and endorsed by funding agencies and by publishers and their journals, including PLOS, the Nature research journals, and other top-tier journals. Yet our analysis of papers published in PLOS and Nature journals indicates that there has been very little improvement in reporting standards since then. This suggests that authors, referees, and editors are generally ignoring the guidelines, and that editorial endorsement has yet to be effectively implemented.
This Github repository contains curriculum and materials for courses and workshops taught …
This GitHub repository contains curriculum and materials for courses and workshops taught through the University of Miami Libraries.
If you are not looking for the repository, but simply the curriculum materials, please see the hosted version: https://umiamilibraries.github.io/courses-and-workshops/.
The repository was started through the Data Curation Initiative (http://library.miami.edu/datacuration) at the University of Miami Libraries (http://library.miami.edu).
The repository was created by Tim Norris with help and inspiration from many others including Elizabeth Fish, Angela Clark, and all the students, faculty, and staff who have participated in the seminar.
These six interactive modules help researchers, data stewards, managers, and the public understand the value of data management in science and learn best practices for good data management within their organizations.
Acknowledgments The USGS Data Management Training modules were funded by the USGS Community for Data Integration and the USGS Office of Organizational and Employee Development's Technology Enabled Learning Program in collaboration with Bureau of Land Management, California Digital Library, and Oak Ridge National Laboratory. Special thanks to Jeffrey Morisette, Dept. of the Interior North Central Climate Science Center; Janice Gordon, USGS Science Analytics and Synthesis; National Indian Programs Training Center; and Keith Kirk, USGS Office of Science Quality Information.
Cite: U.S. Geological Survey, 2021, USGS Data Management Website: U.S. Geological Survey, https://doi.org/10.5066/F7MW2G15.
Software Carpentry lesson on how to use the shell to navigate the filesystem and write simple loops and scripts. The Unix shell has been around longer than most of its users have been alive. It has survived so long because it’s a power tool that allows people to do complex things with just a few keystrokes. More importantly, it helps them combine existing programs in new ways and automate repetitive tasks so they aren’t typing the same things over and over again. Use of the shell is fundamental to using a wide range of other powerful tools and computing resources (including “high-performance computing” supercomputers). These lessons will start you on a path towards using these resources effectively.
Unmanned Aircraft Systems (UAS) are an integral part of the US national critical infrastructure. They must be protected from hostile intent or use to the same level as any other military or commercial asset involved in US national security. However, from the Spratly Islands to Djibouti to heartland America, the expanding Chinese Unmanned Aircraft Systems (UAS / Drone) industry has outpaced the US technologically and numerically on all fronts: military, commercial, and recreational.
Both countries found that there were large information security gaps in unmanned systems that could be exploited on the international cyber-security stage. Many of those gaps remain today and are direct threats to US advanced Air Assets if not mitigated upfront by UAS designers and manufacturers. The authors contend that US military / commercial developers of UAS hardware and software must perform cyber risk assessments and mitigations prior to delivery of UAS systems to stay internationally competitive and secure.
The authors have endeavored to bring a breadth and quality of information to the reader that is unparalleled in the unclassified sphere. This book will fully immerse and engage the reader in the cyber-security considerations of this rapidly emerging technology that we know as unmanned aircraft systems (UAS). Topics covered include National Airspace System (NAS) policy issues; information security; UAS vulnerabilities in key systems (Sense and Avoid / SCADA); collision avoidance systems; stealth design; intelligence, surveillance, and reconnaissance (ISR) platforms; weapons systems security; electronic warfare considerations; data-links; jamming; operational vulnerabilities; and still-emerging political scenarios that affect US military / commercial decisions.
The CONsolidated Standards Of Reporting Trials (CONSORT) Statement provides a minimum standard set of items to be reported in published clinical trials; it has received widespread recognition within the biomedical publishing community. This research aims to provide an update on the endorsement of CONSORT by high impact medical journals. Methods: We performed a cross-sectional examination of the online “Instructions to Authors” of 168 high impact factor (2012) biomedical journals between July and December 2014. We assessed whether the text of the “Instructions to Authors” mentioned the CONSORT Statement and any CONSORT extensions, and we quantified the extent and nature of the journals’ endorsements of these. These data were described by frequencies. We also determined whether journals mentioned trial registration and the International Committee of Medical Journal Editors (ICMJE; other than in regards to trial registration) and whether either of these was associated with CONSORT endorsement (relative risk and 95% confidence interval). We compared our findings to the two previous iterations of this survey (in 2003 and 2007). We also identified the publishers of the included journals. Results: Sixty-three percent (106/168) of the included journals mentioned CONSORT in their “Instructions to Authors.” Forty-four endorsers (42%) explicitly stated that authors “must” use CONSORT to prepare their trial manuscript, 38% required an accompanying completed CONSORT checklist as a condition of submission, and 39% explicitly requested the inclusion of a flow diagram with the submission. CONSORT extensions were endorsed by very few journals. One hundred and thirty journals (77%) mentioned ICMJE, and 106 (63%) mentioned trial registration. Conclusions: The endorsement of CONSORT by high impact journals has increased over time; however, specific instructions on how CONSORT should be used by authors are inconsistent across journals and publishers. Publishers and journals should encourage authors to use CONSORT and set clear expectations for authors about compliance with CONSORT.
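The association measure named in the abstract is a relative risk with a 95% confidence interval. A short sketch of the standard large-sample calculation follows; the 2x2 cell counts are hypothetical, not the study's data.

```python
# Sketch of a relative-risk calculation with a 95% confidence interval, the measure
# reported in the abstract. The 2x2 counts below are hypothetical, not study data.
import math

# Illustrative framing: exposed = journals mentioning ICMJE; outcome = endorsing CONSORT.
a, b = 90, 40    # exposed: with / without the outcome
c, d = 16, 22    # unexposed: with / without the outcome

rr = (a / (a + b)) / (c / (c + d))

# 95% CI on the log scale (standard large-sample formula).
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```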
Students will predict how many amusement parks are in their state. They will then analyze census data on the numbers of amusement parks in all 50 states in 2016. (Data in this activity do not include the District of Columbia or Puerto Rico.) Then students will write numbers as fractions and create a visual model of the data.
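The fraction-writing step amounts to expressing a state's count as a share of the national total. A small Python sketch using illustrative counts (not the 2016 census data):

```python
# Sketch of the fraction step: one state's amusement parks as a fraction of the national
# total. Counts are hypothetical, not the 2016 census figures used in the activity.
from fractions import Fraction

parks_by_state = {"Ohio": 24, "Texas": 30, "Wyoming": 2}   # illustrative counts
national_total = 600                                        # illustrative 50-state total

for state, count in parks_by_state.items():
    share = Fraction(count, national_total)                 # reduced automatically
    print(f"{state}: {count}/{national_total} = {share} of all parks")
```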
In this lesson, students will take temperature readings in the outdoor classroom, compare them to data from a graph, and discuss the numerical differences between the readings and the data.
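The comparison step is simple subtraction of each reading from the corresponding graphed value; a tiny sketch with hypothetical numbers:

```python
# Sketch of the comparison step: numerical differences between outdoor readings and
# reference values read off the graph. All numbers are hypothetical.
readings_f  = [61.0, 63.5, 66.0, 64.5]      # student thermometer readings (deg F)
reference_f = [60.0, 64.0, 67.0, 63.0]      # values read from the class graph (deg F)

differences = [round(obs - ref, 1) for obs, ref in zip(readings_f, reference_f)]
print(differences)   # [1.0, -0.5, -1.0, 1.5]
```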
This lesson sought to engage students in data interpretation and to encourage critical thinking about neurophysiology in the context of temperature sensation. A family of receptors called Transient Receptor Potential (TRP) channels is activated in response to specific temperatures. Upon activation, TRP channels can trigger sensory neurons to signal the perception of temperature. In this lesson, which we have tested with nearly 1,000 students in 12 class sections over four years, students worked in groups of three to identify a "mystery" TRP channel by interpreting five different sets of data. In this activity, we used an approach that we call Sequential Interpretation of Data in an Envelope (SIDE), where students sequentially analyze primary source-like data that are on folded pieces of paper in an envelope. Students analyzed data and hypothesized which TRP channel(s) might be their mystery TRP channel. Students then analyzed four additional sets of data from different experiments and revised their hypotheses about their mystery TRP channel after each experiment. There are four different mystery TRP channels, so different groups of three students analyzed different data and came to different conclusions. This lesson gave students the opportunity to analyze data from multiple experiments. We assessed learning through an in-class worksheet, which students completed as they interpreted each data set, and a post-class homework assignment, which students completed online.
The Data Management and Sharing Plan (DMSP) Tool, or DMPTool, is a free resource that helps researchers create data management and sharing plans as they write their funding proposals.
By the end of this tutorial, you will be able to:
- Log in to the DMPTool as an institutional affiliate.
- Access and use existing data management plans and templates.
- Identify project details for your plan that meet funder and institutional guidelines.
- Identify research outputs needed to meet funder and institutional guidelines.
- Request expert feedback for your plan.
- Recall the steps to save, download, and submit your plan to your Research Administrator and submit updates as your project progresses.
This lesson is part of the Software Carpentry workshops that teach how to use version control with Git. Wolfman and Dracula have been hired by Universal Missions (a space services spinoff from Euphoric State University) to investigate if it is possible to send their next planetary lander to Mars. They want to be able to work on the plans at the same time, but they have run into problems doing this in the past. If they take turns, each one will spend a lot of time waiting for the other to finish, but if they work on their own copies and email changes back and forth things will be lost, overwritten, or duplicated. A colleague suggests using version control to manage their work. Version control is better than mailing files back and forth: Nothing that is committed to version control is ever lost, unless you work really, really hard at it. Since all old versions of files are saved, it’s always possible to go back in time to see exactly who wrote what on a particular day, or what version of a program was used to generate a particular set of results. As we have this record of who made what changes when, we know who to ask if we have questions later on, and, if needed, revert to a previous version, much like the “undo” feature in an editor. When several people collaborate in the same project, it’s possible to accidentally overlook or overwrite someone’s changes. The version control system automatically notifies users whenever there’s a conflict between one person’s work and another’s. Teams are not the only ones to benefit from version control: lone researchers can benefit immensely. Keeping a record of what was changed, when, and why is extremely useful for all researchers if they ever need to come back to the project later on (e.g., a year later, when memory has faded). Version control is the lab notebook of the digital world: it’s what professionals use to keep track of what they’ve done and to collaborate with other people. Every large software development project relies on it, and most programmers use it for their small jobs as well. And it isn’t just for software: books, papers, small data sets, and anything that changes over time or needs to be shared can and should be stored in a version control system.
This webinar will introduce the concept of version control and the version control features that are built into the Open Science Framework (OSF; https://osf.io). The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, GitHub, and Mendeley, to streamline workflows and increase efficiency. This webinar will discuss how keeping track of different file versions is important for efficient, reproducible research practices, how version control works on the OSF, and how researchers can view and download previous versions of files.
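Previous file versions can also be inspected programmatically. The rough sketch below lists a file's version history through the OSF v2 REST API (https://api.osf.io/v2/); the endpoint path, the file id, and the JSON layout shown are assumptions based on the API's JSON:API conventions, not output verified against the live service or the webinar's examples.

```python
# Rough sketch (assumed endpoint and response shape) of listing a file's version history
# through the OSF v2 REST API. The file id below is hypothetical.
import requests

FILE_ID = "abcde"   # hypothetical OSF file id
resp = requests.get(f"https://api.osf.io/v2/files/{FILE_ID}/versions/", timeout=30)
resp.raise_for_status()

for version in resp.json().get("data", []):
    # Each entry is a JSON:API resource; print its identifier and attributes as returned.
    print(version.get("id"), version.get("attributes"))
```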
No restrictions on your remixing, redistributing, or making derivative works. Give credit to the author, as required.
Your remixing, redistributing, or making derivative works comes with some restrictions, including how it is shared.
Your redistributing comes with some restrictions. Do not remix or make derivative works.
Most restrictive license type. Prohibits most uses, sharing, and any changes.
Copyrighted materials, available under Fair Use and the TEACH Act for US-based educators, or other custom arrangements. Go to the resource provider to see their individual restrictions.