In 2016 the LIS-Bibliometrics Forum commissioned the development of a set of bibliometric competencies (2017 Model), available at https://thebibliomagician.wordpress.com/2017-competencies-archived/. The work, sponsored by a small research grant from Elsevier's Research Intelligence Division, was led by Dr. Andrew Cox of the University of Sheffield and Dr. Sabrina Petersohn of the Bergische Universität Wuppertal, Germany. The aim of the competency statements was to ensure that bibliometric practitioners were equipped to do their work responsibly and well.
The Competency Model was updated in July 2021 and includes a colour gradient to reflect the Levels and how they build upon one another. In particular, the 2021 competencies can help:
- To identify skills gaps
- To support progression through career stages for practitioners in the field of bibliometrics
- To prepare job descriptions
The paper describing the work underpinning the model is available here: http://journals.sagepub.com/doi/abs/10.1177/0961000617728111. The competencies are intended to be a living document and will be reviewed over time.
Being active on social media, such as Twitter and blogs, is one way to reach a larger audience and enhance a researcher's impact. Through these additional channels, not only other researchers but also the public, policy makers, and the press can learn about their findings. The toolkit shows several ways to get in touch with other researchers and discuss findings at an early stage in research networks, at conferences, and on social media. It presents open tools for co-writing, online meetings, and reference and project management.
Created as a supplement to the Impact Measurement collection of the Scholarly Communication Notebook (SCN), this annotated bibliography describes some of the core literature in the field as well as resources that cannot be included on the SCN because they are not openly licensed, though they are free to read. It is separated into three sections: peer-reviewed scholarly articles; blog posts, initiatives, and guides; and resources for further education and professional development. The first section is intended to help practitioners in the field of research assessment and bibliometrics understand high-level core concepts in the field. The second section offers resources that are more applicable to practice. The final section includes links to blogs, communities, discussion lists, paid and free educational courses, and archived conferences, so that practitioners and professionals can stay abreast of emerging trends, improve their skills, and find community. Most of these resources could not be included on the Scholarly Communication Notebook because they are not openly licensed; however, all resources in this bibliography are freely available to access and read.
Slides from the Keynote talk given at Virginia Tech Open Access Week on 20 October 2020. See the full presentation recording and panel discussion at https://vtechworks.lib.vt.edu/handle/10919/100682.
Virginia Tech's Open Access Week 2020 keynote speaker, Elizabeth (Lizzie) Gadd, Research Policy Manager (Publications) at Loughborough University in the UK, gives a talk about how what we reward through recruitment, promotion, and tenure processes is not always what we actually value about research activity. The talk explores how we can pursue value-led evaluations, and how we can persuade senior leaders of their benefits.
The keynote talk is followed by a panel discussion with faculty members at Virginia Tech: Thomas Ewing (Associate Dean for Graduate Studies and Research and Professor of History), Carla Finkielstein (Associate Professor of Biological Sciences), Bikrum Gill (Assistant Professor of Political Science), and Sylvester Johnson (Professor and Director of the Center for Humanities). The panel is moderated by Tyler Walters (Dean, University Libraries).
The slides from this presentation are in Loughborough University's repository under a CC BY-NC-SA 4.0 license. https://repository.lboro.ac.uk/articles/presentation/Counting_what_counts_in_recruitment_promotion_and_tenure/13113860
The signatories of the Helsinki Initiative on Multilingualism in Scholarly Communication support recommendations, to be adopted by policy-makers, leaders, universities, research institutions, research funders, libraries, and researchers, that keep research international and multilingual. This initiative helps to support bibliodiversity, protect locally relevant research, and promote language diversity in research evaluation. Signatories, events, media, and more information can be found at https://www.helsinki-initiative.org/
HuMetricsHSS supports the creation of values-based frameworks to guide all kinds of scholarly processes, and to promote the nurturing of a values-enacted approach to academia writ large. During the 2016 Triangle Scholarly Communication Institute (SCI), the authors sketched a preliminary set of core values for enriching scholarship, highlighting five: Equity, Openness, Collegiality, Quality, and Community. They created a framework intended to help transform how scholarship is created, assessed, and valued in the humanities.
At the workshops and in the toolkit, they emphasize that values are locally negotiated and frameworks locally built. That is the explicit point of the workshop: to make space for open conversation about values and their meaning, to come to agreement on what matters for a given group, and then to work on constructing a framework that could guide evaluation in the academy, whether through the tenure and promotion process, the setting of annual goals, the hiring of new faculty, or decision-making about which digitization projects to take on, which collections to develop, or which projects to publish at an academic press.
The programme aims to equip learners with the skills and knowledge required to engage in the use of a range of metrics around research impact and gain understanding of the research landscape. This is a flexible programme – you can do as much or as little as suits you. While some Things are interlinked, each of the Things is designed to be completed separately, in any order and at any level of complexity. Choose your own adventure!
There are three levels for each Thing:
- Getting started is for you if you are just beginning to learn about each topic
- Learn more is for you if you know a bit but want to know more
- Challenge me is often more in-depth, or assumes that you are familiar with at least the basics of each topic
Journal article abstract: With the help of academic search engine optimization (ASEO), publications can more easily be found in academic search engines and databases. Authors can improve the ranking of their publications by adjusting titles, keywords, and abstracts. Carefully considered wording makes publications easier to find and, ideally, more likely to be cited. This article is meant to support authors in making their scholarly publications more visible. It provides basic information on ranking mechanisms as well as tips and tricks on how to improve the findability of scholarly publications, while also pointing out the limits of optimization. This article, authored by three scholarly communications librarians, draws on their experience of hosting journals, providing workshops for researchers and individual publication support, as well as on their investigations of the ranking algorithms of search engines and databases.
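At its core, the article's advice is about aligning a manuscript's wording with the terms readers actually search for. As a rough, hypothetical illustration of the kind of check this enables (the function, term list, and texts below are invented, not from the article), one could count how often target search terms appear in a draft title and abstract:

```python
# Hypothetical pre-submission check: how often do the search terms you
# want to rank for actually appear in your title and abstract?
# The function, terms, and texts are invented for illustration.

def keyword_coverage(text: str, keywords: list[str]) -> dict[str, int]:
    """Count occurrences of each target search term in a text."""
    lowered = text.lower()
    return {kw: lowered.count(kw.lower()) for kw in keywords}

title = "Improving the findability of scholarly publications"
abstract = ("We review the ranking mechanisms of academic search engines and "
            "show how carefully chosen titles, keywords, and abstracts "
            "improve findability.")
targets = ["academic search engine optimization", "findability", "ranking"]

print(keyword_coverage(title, targets))     # a zero suggests rewording
print(keyword_coverage(abstract, targets))
```

A count of zero for an important term would suggest rewording the title or abstract before submission, within the limits of optimization the article warns about.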
What does it mean to have meaningful metrics in today’s complex higher education landscape? With a foreword by Heather Piwowar and Jason Priem, this highly engaging and activity-laden book serves to introduce readers to the fast-paced world of research metrics from the unique perspective of academic librarians and LIS practitioners. Starting with the essential histories of bibliometrics and altmetrics, and continuing with in-depth descriptions of the core tools and emerging issues at stake in the future of both fields, Meaningful Metrics is a convenient all-in-one resource that is designed to be used by a range of readers, from those with little to no background on the subject to those looking to become movers and shakers in the current scholarly metrics movement. Authors Borchardt and Roemer offer tips, tricks, and real-world examples that illustrate how librarians can support the successful adoption of research metrics, whether in their institutions or across academia as a whole.
This UK report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration.

This review has gone beyond earlier studies to take a deeper look at potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture.

The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.
The Metrics Toolkit co-founders and editorial board developed the Metrics Toolkit to help scholars and evaluators understand and use citations, web metrics, and altmetrics responsibly in the evaluation of research.
The Metrics Toolkit provides evidence-based information about research metrics across disciplines, including how each metric is calculated, where you can find it, and how each should (and should not) be applied. You’ll also find examples of how to use metrics in grant applications, CVs, and promotion packages.
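As an illustration of the kind of calculation the Toolkit documents, here is a minimal sketch of one widely used metric, the h-index: a set of papers has index h if h of them have at least h citations each. (The code is illustrative only; the Toolkit itself is a reference resource, not a software package.)

```python
def h_index(citation_counts: list[int]) -> int:
    """h-index: the largest h such that h papers each have >= h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(ranked, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Invented example: the top three papers each have at least 3 citations,
# but the fourth-ranked paper has only 2, so the h-index is 3.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```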
This resource links to the full course (all 13 weeks of modules) on the Internet Archive. The video lectures for the courses are also available on YouTube at https://www.youtube.com/watch?v=maRP_Wvc4eY&list=PLWYwQdaelu4en5MZ0bbg-rSpcfb64O_rd
This series was designed and taught by Chris Belter, Ya-Ling Lu, and Candace Norton at the NIH Library. It was originally presented in weekly installments to NIH Library staff from January-May 2019 and adapted for web viewing later the same year.
The goal of the series is to provide free, on-demand training on how we do bibliometrics for research evaluation. Although demand for bibliometric indicators and analyses in research evaluation is growing, broadly available and easily accessible training on how to provide those analyses is scarce. We have been providing bibliometric services for years, and we wanted to share our experience with others to facilitate the broader adoption of accurate and responsible bibliometric practice in research assessment. We hope this series acts as a springboard for others to get started with bibliometrics so that they feel more comfortable moving beyond this series on their own.
Navigating the Series

The training series consists of 13 individual courses, organized into 7 thematic areas. Links to each course in the series are provided on the left. Each course includes a training video with audio transcription, supplemental reading to reinforce the concepts introduced in the course, and optional practice exercises.
We recommend that the courses be viewed in the order in which they are listed. The courses are listed in the same order as the analyses that we typically perform to produce one of our standard reports. Many of the courses also build on concepts introduced in previous courses, and may be difficult to understand if viewed out of order. We also recommend that the series be taken over the course of 13 consecutive weeks, viewing one course per week. A lot is covered in these courses, so it is a good idea to take your time with them to make sure you understand each course before moving on to the next. We also recommend you try to complete the practice exercises that accompany many of the courses, because the best way to learn bibliometrics is by doing it.
This 25-minute course from the University of Glasgow looks at: the thinking behind the move towards narrative CV and assessment formats; how the research landscape and research assessment practices are evolving, and efforts to develop fairer assessment approaches; advice and tips on what to include in a more narrative format; and examples from real narrative CVs written by early-career researchers. The course is directed at early-career researchers, specifically those making use of the Resume for Researchers format (e.g., via UK Research and Innovation (UKRI), the non-departmental public body of the UK government that directs research and innovation funding). Many funding agencies, the industry and corporate sector, and universities now require a more narrative-style CV that incorporates qualitative aspects into job applications (particularly in relation to describing input to publications and the significance of that input).
The goal of these formats is to help researchers share their varied contributions to research in a consistent way across a wide range of career paths and personal circumstances, and to move away from relying on narrowly focused performance indicators that can make it harder to assess, reward, or nurture the full range of contributions a researcher or academic makes to their field or discipline. This course helps researchers structure, write, and craft a narrative CV that highlights their individual academic accomplishments and contributions, with a particular emphasis on 'how' they contributed rather than only 'what' they contributed.
These research metric source cards provide the citation for a scholarly work together with that work's research metrics, which can include the Altmetric Attention Score, scholarly citation counts from different data sources, and field-weighted citation indicators. Abstracts and important context for some of the metrics are also included, e.g., citation statements and the titles of select online mentions (such as news and blog article titles, Wikipedia pages, and patent citations), along with the context behind those online mentions. There are four printable source cards (front and back) followed by activity questions for each source card. These cards help students engage with and interrogate the meaning behind the bibliometrics and altmetrics of specific scholarly works, as well as evaluate the credibility, authority, and reliability of the scholarly work itself.
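One of the indicators the cards surface, the field-weighted citation indicator, is commonly computed as the ratio of a work's actual citations to the citations expected for comparable works of the same field, publication year, and document type (Scopus's Field-Weighted Citation Impact takes this form). A minimal sketch with invented numbers:

```python
def field_weighted_citation_impact(actual_citations: int,
                                   expected_citations: float) -> float:
    """Ratio of a work's citations to the average citations of comparable
    works (same field, publication year, and document type). A value of
    1.0 means the work is cited exactly at the field-expected rate."""
    if expected_citations <= 0:
        raise ValueError("expected_citations must be positive")
    return actual_citations / expected_citations

# Invented example: 24 citations against a field average of 12 means the
# work is cited at twice the expected rate.
print(field_weighted_citation_impact(24, 12.0))  # -> 2.0
```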
The Declaration on Research Assessment (DORA) recognizes the need to improve the ways in which the outputs of scholarly research are evaluated. The declaration was developed in 2012 during the Annual Meeting of the American Society for Cell Biology in San Francisco. It has become a worldwide initiative covering all scholarly disciplines and all key stakeholders including funders, publishers, professional societies, institutions, and researchers. The DORA initiative encourages all individuals and organizations who are interested in developing and promoting best practice in the assessment of scholarly research to sign DORA.
Other resources are available on their website, such as case studies of universities and national consortia that demonstrate key elements of institutional change to improve academic career success.
This guide has been created by bibliometric practitioners to support other users of InCites, a research analytics tool from Clarivate Analytics that uses bibliographic data from Web of Science; the guide promotes a community of informed and responsible use of research impact metrics. The recommendations in this document may be most suited to academic sector users, but the authors hope that other users may also benefit from the suggestions.

The guide aims to provide plain-English definitions, key strengths and weaknesses, and some practical application tips for some of the most commonly used indicators available in InCites. The indicator definitions are followed by explanations of the data that powers InCites, educating users on where the data comes from and how the choices made in selecting and filtering data will impact on final results. Also in this document are a comparative table highlighting differences between indicators in InCites and SciVal, another commonly used bibliometric analytics programme, and instructions on how to run group reports.

All of the advice in this document is underpinned by a belief in the need to use InCites in a way that respects the limitations of indicators as quantitative assessors of research outputs. Both of the authors are members of signatory institutions of DORA, the San Francisco Declaration on Research Assessment. A summary of advice on using indicators and bibliometric data responsibly is available on pages 4-5 and should be referred to throughout. Readers are also recommended to refer to the official InCites Indicators Handbook produced by Clarivate Analytics. The guide was written with complete editorial independence from Clarivate Analytics, the owners of InCites; Clarivate Analytics supported the authors of this document by checking for factual accuracy only.
This guide is designed to help those who use SciVal, a research analytics tool from Elsevier that draws its bibliographic data from Scopus, to source and apply bibliometrics in academic institutions. It was originally devised in February 2018 by Dr. Ian Rowlands of King’s College London as a guide for his university, which makes SciVal widely available to its staff. King’s does this because it believes that bibliometric data are best used in context by specialists in the field. A small group of LIS-Bibliometrics committee members reviewed and revised the King’s guide to make it more applicable to a wider audience. SciVal is a continually updated source, so feedback is always welcome at LISBibliometrics@jiscmail.ac.uk. LIS-Bibliometrics is keen that bibliometric data should be used carefully and responsibly, and this requires an understanding of the strengths and limitations of the indicators that SciVal publishes.
The purpose of this Guide is to help researchers and professional services staff to make the most meaningful use of SciVal. It includes some important 'inside track' insights and practical tips that may not be found elsewhere. The scope and coverage limitations of SciVal are fairly widely understood and serve as a reminder that these metrics are not appropriate in fields where scholarly communication takes place mainly outside the journal and conference literature. This is one of the many judgment calls that need to be made when putting bibliometric data into their proper context. One of the most useful features of SciVal is the ability to drill down in detail using various filters. This allows a user to define a set of publications accurately, but it may mean generating top-level measures that are based on small samples with considerable variance. Bibliometric distributions are often highly skewed, where even apparently simple concepts like the 'average' can be problematic. So one objective of this Guide is to set out some advice on sample sizes and broad confidence intervals, to avoid over-interpreting the headline data. Bibliometric indicators should always be used in combination, not in isolation, because each can only offer partial insights. They should also be used in a 'variable geometry' alongside other quantitative and qualitative indicators, including expert judgments and non-publication metrics such as grants or awards, to flesh out the picture.
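To make the Guide's point about skew and small samples concrete, the following sketch (with invented citation counts) shows how a single highly cited paper pulls the mean far above the median, and how wide a rough bootstrap confidence interval for the mean can be at this sample size:

```python
import random
import statistics

# Invented citation counts for a small publication set: many lightly
# cited papers and one highly cited outlier, a typically skewed pattern.
citations = [0, 0, 1, 1, 2, 2, 3, 4, 6, 120]

print("mean:  ", statistics.mean(citations))    # 13.9, pulled up by the outlier
print("median:", statistics.median(citations))  # 2.0, closer to a typical paper

# Rough bootstrap 95% interval for the mean, showing how unstable the
# headline average is at this sample size.
random.seed(1)
boot_means = sorted(
    statistics.mean(random.choices(citations, k=len(citations)))
    for _ in range(10_000)
)
print("95% CI for the mean:", boot_means[249], "to", boot_means[9749])
```

The interval spans from near zero to several times the median, which is exactly why the Guide advises against reading too much into headline averages for small publication sets.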
No restrictions on your remixing, redistributing, or making derivative works. Give credit to the author, as required.
Your remixing, redistributing, or making derivative works comes with some restrictions, including how it is shared.
Your redistributing comes with some restrictions. Do not remix or make derivative works.
Most restrictive license type. Prohibits most uses, sharing, and any changes.
Copyrighted materials, available under Fair Use and the TEACH Act for US-based educators, or other custom arrangements. Go to the resource provider to see their individual restrictions.