Alternative metrics (altmetrics) – new ways to measure research impact – raise a lot of questions amongst the scientific community. What do these metrics actually mean? And more importantly, what do they actually measure? It’s hard to gauge the impact of a research article based on how many times it has been tweeted or posted to Facebook: how does that prove that the person posting it actually read the article, or used it within their own research?
Personally, I love the idea of altmetrics, but I don’t think it has quite reached the point where we can compare it to established measures like the impact factor or the h-index (although these are ultimately flawed as well). Heather Piwowar does an excellent job of describing altmetrics in her article in Nature, and it aligns well with my own ideas of what altmetrics try to achieve:
“Altmetrics give a fuller picture of how research products have influenced conversation, thought and behaviour.”
I like to think of the “fuller picture” of altmetrics as the evolving story of a journal article. Altmetrics doesn’t necessarily tell us how influential or prominent a journal article has been, but it tells us how the article has been used, shared and communicated over time via social media, the web and the scholarly community. I think that the emergence of several prominent altmetric platforms will eventually lead to a more effective way of evaluating scholarly impact in the form of a hybrid system. In fact, an article written yesterday by Pat Loria on the LSE blog states that “as more systems incorporate altmetrics into their platforms, institutions will benefit from creating an impact management system to interpret these metrics, pulling in information from research managers, ICT and systems staff, and those creating the research impact”. His post is definitely worth a read and would be a great follow-up to the content I will present here. He even compares several of the altmetrics platforms that I will outline in this post.
For this post, I thought it would be a good idea to introduce some of the most prominent altmetric platforms within the scholarly publication ecosystem. Below I will describe each platform and explain how it communicates the impact and metrics of scholarly research, hopefully providing a better understanding of how this type of measurement works.
ImpactStory aligns well with my idea of altmetrics because its goal is to tell the story of how research and scholarly publications are shared and discussed. ImpactStory tracks metrics across a variety of commonly used services such as Delicious, Scopus, Mendeley, PubMed and even SlideShare (among many others). You can import your Google Scholar profile, or even your Dryad records. Once you have imported the sources you want to measure, ImpactStory tells you how many times an article has been saved by scholars, how many times it has been cited by scholars, how many people have discussed it in public (via Twitter, Facebook, etc.) and how many times it has been cited by the public (e.g. a Wikipedia article or blog post).
Anyone who has research material in any of the platforms that ImpactStory supports can view their metrics very easily by creating their own collection. Researchers can also embed a widget into their websites that will attach ImpactStory metrics to their citations, indicating if an article is highly discussed or cited by scholars and the public. I think ImpactStory is an excellent model for altmetrics because it combines traditional metrics with new, social metrics suitable for discovering web impact.
Perhaps the most well known of the altmetrics tools, Altmetric offers three main products built around embeddable content about particular journal articles. The most prominent product from Altmetric is their Explorer program; it is comprehensive in that it provides information about how many times an article has been viewed and how the article ranks within its journal. Explorer also provides a list of social components: how many times an article has been picked up by a news feed, how often it has been tweeted, and who has discussed it on Google+ and several other social media platforms. Using Explorer, a researcher can even see the demographics of who has seen their article. This is an excellent feature, as it gives people an idea of who is looking at the material. As a librarian, I would be interested to know who is looking at my research: librarians? doctors? the scientific research community?
Altmetric also provides services for publishers where they can embed Altmetric badges that will provide additional information about their articles. Publishers can customize their pages that present the metrics so that their branding can be included.
Finally, Altmetric has a bookmarklet that will provide altmetrics about an article you’re reading. I personally use this feature for fun because it is interesting to learn a little bit more about how an article has been used. The only problem is that Altmetric does not have data for every single journal publication. This means that a large portion of the time I’m clicking on the bookmarklet for an article I’m reading and there is no data available. This is especially the case with library literature – which could be incentive to try and get the LISA and LISTA databases on board. Either way, if you’re interested you can add the bookmarklet from Altmetric’s website.
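Behind the bookmarklet, Altmetric exposes a public API that returns a JSON summary of the attention an article has received, looked up by DOI. As a rough sketch of how you might pull a few headline numbers out of such a response – the exact field names here are my assumptions, and the sample response is made up for illustration:

```python
import json

# Altmetric's public API looks articles up by DOI.
ALTMETRIC_API = "https://api.altmetric.com/v1/doi/{doi}"

def altmetric_url(doi):
    """Build the request URL for a given DOI."""
    return ALTMETRIC_API.format(doi=doi)

def summarize(response_json):
    """Pull a few headline numbers out of an Altmetric-style response.

    The field names ("score", "cited_by_tweeters_count") are assumptions
    about the response shape, not a guaranteed schema. Missing fields
    default to 0, which matches the 'no data available' case above.
    """
    data = json.loads(response_json)
    return {
        "title": data.get("title"),
        "score": data.get("score", 0),
        "tweets": data.get("cited_by_tweeters_count", 0),
    }

# A made-up sample response, standing in for a live API call:
sample = '{"title": "An example article", "score": 12.5, "cited_by_tweeters_count": 30}'
print(summarize(sample))
```

In practice you would fetch `altmetric_url(doi)` over HTTP and pass the body to `summarize`; articles Altmetric has never seen return no data, which is exactly the empty-bookmarklet experience described above.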
Plum Analytics is the third power player in the altmetrics arena. The goal of Plum Analytics is “to give researchers and funders a data advantage when it comes to conveying a more comprehensive and timely impact of their output”. Plum collects altmetrics and categorizes them into five different groups: usage, captures, mentions, social media, and citations.
For usage, Plum looks at downloads, views, book holdings, ILL, and document delivery. This is where the library component comes in. If altmetric platforms like Plum are tracking ILLs and document delivery requests for research literature, librarians should be aware of this and look to contribute to the effort.
The second category, captures, provides information about the favorites, bookmarks, saves, readers, groups, and watchers of an article.
Mentions cover the blog posts, news stories, Wikipedia articles, comments, and reviews of research articles.
Social media refers to the tweets, shares, +1’s and likes based on a research article, and finally citations in Plum Analytics currently cover PubMed, Scopus and Patent citations. You can look at their information page to see how they define all of their terminology.
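To make the five-group categorization concrete, here is a small sketch that rolls raw event counts up into Plum-style groups. The group names follow the categories above, but the mapping of individual event types to groups is my own illustrative guess, not Plum’s actual schema:

```python
# Plum Analytics' five metric groups, as described above.
# The event-to-group mapping is illustrative, not Plum's real schema.
METRIC_GROUPS = {
    "usage": {"downloads", "views", "holdings", "ill", "document_delivery"},
    "captures": {"favorites", "bookmarks", "saves", "readers", "watchers"},
    "mentions": {"blog_posts", "news", "wikipedia", "comments", "reviews"},
    "social_media": {"tweets", "shares", "plus_ones", "likes"},
    "citations": {"pubmed", "scopus", "patents"},
}

def group_counts(events):
    """Roll raw per-event counts up into the five groups.

    Events that match no group are silently ignored, since an
    aggregator has to tolerate sources it doesn't recognize.
    """
    totals = {group: 0 for group in METRIC_GROUPS}
    for event, count in events.items():
        for group, members in METRIC_GROUPS.items():
            if event in members:
                totals[group] += count
    return totals

print(group_counts({"downloads": 120, "tweets": 30, "scopus": 4}))
```

Grouping like this is what lets a platform report “usage: 120, social media: 30, citations: 4” instead of a flat list of raw counts.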
Peer Evaluation is a different sort of altmetric platform in that it is designed as an open peer review service where researchers can curate their own peer review process for scholarly publications. The goal of Peer Evaluation is for researchers to make their work visible within their community, and to be able to track the impact and reuse of what they share. Researchers can submit their articles, data, working papers, books, etc. to Peer Evaluation and have other researchers review their work. Furthermore, because this is a community effort, the researcher can in turn review other people’s work as well. Peer Evaluation provides qualitative and quantitative metrics that help researchers understand the impact of their work, and then share their feedback with others in their community. This idea is unique within the altmetrics realm, and there has been a considerable amount of participation from the scientific community.
Research Scorecard is a company devoted to “characterizing and quantifying scientific expertise to facilitate scientific collaborations”. Focusing primarily on the biotechnology and pharmaceutical domains, Research Scorecard builds reports and databases for researchers and academic institutions to evaluate the products that they use and how they are used, the people that they collaborate with, the metrics about a specific scientist or researcher, and the funding history of an individual or organization. Research Scorecard is slightly more commercialized than the other platforms that I’ve mentioned here, but I still think it provides valuable information about products, services and researchers within the scientific community.
Librarians! How can we participate?
Librarians should be thinking about how we can best incorporate altmetrics into our own work lives. Librarians working in research environments will need to keep up with altmetrics to evaluate the impact of literature needed for their collection, and to direct researchers to high impact journals for publishing. The shift towards open access publishing will also make altmetrics a valuable tool for librarians to evaluate the impact and quality of these publications. As an academic librarian, I would love to see tools like Altmetric Explorer embedded into a university’s discovery search system or institutional repository.
I think that as altmetrics start to develop a more comprehensive picture of scholarly impact, we will begin to see wider adoption from the scientific community. As Loria states in his blog post, the combination of several platforms in what he calls an Impact Management System (IMS) will be the turning point for altmetrics. If an IMS service can combine all of these research outputs and impacts into one system, it can facilitate the dissemination of a more complete set of research metrics including everything from community and academic impacts to social communication indicators.
Loria makes the point that: “Librarians can help, with their data management skills and aptitude for storytelling.” I have no doubt in my mind that librarians can help, but it is up to us to reach out to these altmetric communities early on so that we can contribute in any way we can. I think it is at least our duty to educate ourselves on the benefits of altmetrics and their potential significance for informing the patrons that we serve.
Other Altmetric Platforms
1. Loria P. The new metrics cannot be ignored – we need to implement centralised impact management systems to understand what these numbers mean [Internet]. London School of Economics and Political Science Blog. 2013. Available from: http://blogs.lse.ac.uk/impactofsocialsciences/2013/03/05/the-new-metrics-cannot-be-ignored/
2. Piwowar H. Altmetrics: Value of all research products [Internet]. Nature. 2013 Jan;493:159. Available from: http://www.nature.com/nature/journal/v493/n7431/full/493159a.html