Category Archives: Citation Metrics

Introduction to Altmetric Explorer – Understanding Immediate Visibility of Publications

October 3, 2022

Featuring a spotlight on the extensive attention generated by open access publications. 

Altmetric Explorer is an online tool that monitors attention on the web, beyond traditional citations, for digitally available research publications. It captures mentions for all types of research outputs, including theses and dissertations, datasets, software and code, and media files.

The training session will be led by Patty Smith of Altmetric. This hour-long session will introduce you to the platform and its features, and show how individual researchers can use it to illustrate the reach of their work alongside traditional citation-based metrics. In celebration of Open Access Week 2022, the general introduction will be paired with a spotlight on the wide reach of open access publications. There will be plenty of time for questions.

Related LibGuide: Alternative Metrics by Emily Nickerson

 


What is Altmetric Explorer?

Altmetrics, in the broader sense, are a way to measure impact by capturing online mentions of research outputs such as papers and datasets. Altmetric Explorer, Plum Analytics and Impactstory are some popular altmetrics tools, and the Library has recently purchased a subscription to Altmetric Explorer.

Using Altmetric Explorer to improve the visibility of your work

Altmetric Explorer is an online tool that searches the web for “online attention” paid to research outputs. It captures attention for all types of research outputs, including theses and dissertations, datasets, software and code, and media files. Altmetric Explorer pulls data from:

  • Public policy documents
  • Mainstream media
  • Post-publication peer-review platforms (PubPeer and Publons)
  • Wikipedia
  • Social media (Twitter, Facebook, Reddit, etc.)
  • Multimedia platforms (YouTube, Stack Overflow, etc.)
  • Patents
  • Open Syllabus Project
  • Blogs & Research Highlights (Faculty Opinions)
  • Online reference managers (Mendeley)

These mentions feed into an algorithm that calculates the Altmetric Attention Score and provides a visualization (the “Altmetric donut”).
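The real scoring algorithm and its weights are proprietary to Altmetric; purely as an illustration of the idea, a score of this kind can be sketched as a weighted sum of mention counts (all source names and weights below are invented for the example):

```python
# Illustrative sketch of a weighted attention score.
# The weights here are invented; Altmetric's actual algorithm
# and weighting scheme are proprietary.
MENTION_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy": 3.0,
    "wikipedia": 3.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def attention_score(mentions: dict[str, int]) -> float:
    """Weighted sum of mention counts across tracked sources."""
    return sum(MENTION_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

score = attention_score({"news": 2, "twitter": 30, "wikipedia": 1})
```

The key design point is that not all attention counts equally: a news story signals more reach than a single tweet, so each source contributes with a different weight.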

Altmetric Explorer is a powerful tool that can provide contextual information when documenting the impact of your work in CVs, tenure and promotion dossiers, or grant and job applications. Not only does it provide insight into the attention your work receives, it also covers the immediate attention that traditional citation-based metrics miss, since citations take months or years to accumulate.

The tool helps to answer questions about your output such as: Was my work covered by any news outlets? Are other researchers commenting on my work? Which countries are looking at my publications? Was any of my scientific output cited in policy documents or patents? And how does attention to my open access publications compare to that received by closed access publications?

As an additional feature, the Altmetric Attention Score will be displayed for all content in UVicSpace, our institutional repository.

Learn more through our LibGuide

Attend one of our upcoming workshops!

Introducing Altmetric Explorer

February 28, 2022

Have you ever wondered what kind of attention your recently published paper received in the academic community before the first citations appeared in the literature? Or what coverage it received outside academia without you noticing? You may not have missed a tweet, but what about media, blogs, or policy papers that mention your publication? You may well be aware of some buzz around your latest publication on a hot topic – but could you present that in a serious way in a research funding proposal? Can that kind of attention even be captured in a structured way, or measured?

The answer is yes – look no further! We have just the tool for you!  

Starting in 2022, UVic Libraries is providing access to Altmetric Explorer, which serves exactly these purposes (and many more). 


Updating journal impact scores

Science | Jeffrey Brainard and Matt Warren | June 27, 2018

Clarivate Analytics issued a June 26 update of its Journal Impact Factors. It now includes “a distribution curve displaying the total number of articles and other items published in a journal versus the number of times each item was cited. The median number of citations for all of the journal’s research articles and review articles is also identified on the curve.”

“Users can drill down into the underlying data to see, for example, the titles of the most highly cited items and, in a separate list, the citations and articles that went into the calculation of the journal’s impact factor.

“The dashboard also displays summary information characterizing a journal’s citations by type of article. This allows users to see, for example, what proportion came from research articles versus review articles. Another chart shows how the journal’s impact factor has fluctuated over recent years.”
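The contrast Clarivate is drawing — a mean-based impact factor sitting well above the median of a skewed citation distribution — can be reproduced in a small sketch (all citation counts below are invented):

```python
import statistics

# Citation counts per article over a two-year window
# (invented numbers; real citation distributions are similarly skewed).
citations = [0, 0, 1, 1, 2, 2, 3, 4, 5, 120]

impact_factor = sum(citations) / len(citations)  # mean-based, like the JIF
median_citations = statistics.median(citations)  # what the new curve highlights
```

Here a single highly cited item pulls the mean to 13.8 while the median stays at 2, which is exactly why showing the full distribution adds useful context.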

For more information see: http://www.sciencemag.org/news/2018/06/firm-tallies-controversial-journal-impact-scores-moves-provide-more-context

CARL Joins Thousands Calling for Improvements to Research Assessment

April 3, 2018

The Canadian Association of Research Libraries (CARL) has signed on to the San Francisco Declaration on Research Assessment (DORA), which recommends changes in practices by the research community regarding the use of research metrics.

For more see: http://www.carl-abrc.ca/news/carl-signs-dora-declaration/

How does Wikipedia utilize scientific research?

Researchers at the University of Chicago’s Knowledge Lab have been exploring how Wikipedia cites academic research, in order to assess the quality and type of material used by Wikipedia contributors.

Their findings suggest that “controlling for field and impact factor, the odds that an open access journal is referenced on the English Wikipedia are 47% higher compared to closed access journals. Moreover, in most of the world's Wikipedias, a journal's high status (impact factor) and accessibility (open access policy) both greatly increase the probability of referencing. Among the implications of this study is that the chief effect of open access policies may be to significantly amplify the diffusion of science, through an intermediary like Wikipedia, to a broad public audience.”
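The “47% higher” figure is an odds ratio of about 1.47 from the study’s regression. As a sketch with an invented baseline probability, here is how an odds ratio translates into a change in probability:

```python
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Scale the baseline odds by an odds ratio and convert
    back to a probability."""
    odds = p_baseline / (1 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# If a comparable closed access journal had a 10% chance of being
# referenced (an invented baseline), an odds ratio of 1.47 gives
# roughly a 14% chance for the open access journal.
p_open = apply_odds_ratio(0.10, 1.47)
```

Note that an odds ratio is not a simple percentage increase in probability: the effect on the probability itself depends on the baseline.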

A pre-print of the article has been made available on arXiv.org.


Scopus announces evaluation process for journals

Scopus, the self-described “largest abstract and citation database of peer-reviewed literature: scientific journals, books and conference proceedings”, has announced that it will begin an annual review process in order to maintain quality control over the journals it indexes. The review process will, among other things, be looking to remove journals that attempt to bolster their impact factor by relying heavily on self-citation.
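As background, a two-year journal impact factor is the number of citations a journal receives in year Y to items it published in the two preceding years, divided by the citable items from those years. The sketch below, with invented counts, shows how heavy self-citation can inflate it:

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Two-year impact factor: citations received in year Y to items
    published in the two preceding years, over citable items from
    those years."""
    return citations / citable_items

# Invented counts for one journal's two-year window.
total_citations = 500   # all citations, including the journal citing itself
self_citations = 200    # citations from the journal to its own articles
citable_items = 100

with_self = impact_factor(total_citations, citable_items)
without_self = impact_factor(total_citations - self_citations, citable_items)
```

In this invented case the impact factor drops from 5.0 to 3.0 once self-citations are excluded, which is the kind of gap a review process would flag.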

Read more about it on the Scholarly Kitchen blog.