Altmetric Attention Score
Can apply to: Journal articles, books, and any research output deposited to a repository that Altmetric tracks (e.g. Figshare, Zenodo, or an institutional repository)
Metric definition: “The Altmetric Attention Score is an automatically calculated, weighted count of all of the attention a research output has received [online, in sources tracked by Altmetric].”
Metric calculation: The AAS is a weighted count of the attention a research output has received across the online attention sources that Altmetric tracks (e.g. Twitter, PubPeer). Each source type is assigned a weight by the company, and the weighting also takes into account whether the author of a mention regularly posts about scholarly articles.
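The weighted-count logic can be illustrated with a minimal sketch. Note that the per-source weights and the "regular poster" modifier below are hypothetical values chosen for illustration only; Altmetric's actual weights and audience tiers are not fully public (see the Transparency section):

```python
# Illustrative sketch of a weighted attention count.
# All weights and the regular-poster modifier are hypothetical stand-ins;
# Altmetric's real per-source weights and tiers are proprietary.

SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def attention_score(mentions):
    """mentions: list of (source, regularly_posts_about_research) tuples.

    Returns a weighted count where each mention contributes its source
    weight, adjusted down for accounts that habitually post about
    scholarly articles (a hypothetical stand-in for Altmetric's tiers).
    """
    score = 0.0
    for source, regular_poster in mentions:
        weight = SOURCE_WEIGHTS.get(source, 1.0)
        modifier = 0.5 if regular_poster else 1.0
        score += weight * modifier
    return score

# One news story, two tweets (one from a habitual paper-poster), one blog post:
example = [("news", False), ("twitter", True), ("twitter", False), ("blog", False)]
print(attention_score(example))
```

The real score is additionally rounded and adjusted in ways the company does not fully disclose, so a sketch like this conveys only the general shape of the calculation, not reproducible values.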
Data sources: News articles, Blogs, Twitter, Facebook, Sina Weibo, Wikipedia, Policy Documents (per source), Q&A, F1000, Publons, Pubpeer, YouTube, Reddit, Pinterest, LinkedIn, Open Syllabus, Google+. Although the Altmetric details page also shows Mendeley readers, Scopus citation counts, and CiteULike bookmarks, these data do not count toward the score.
Appropriate use cases: The AAS is best used by individual researchers to understand the overall volume of attention that research has received online. Individuals may also use the “Score in Context” (found on Altmetric details pages) to understand how a research output’s score compares to other scores. The AAS may also be used by publishers and institutions to aggregate the attention received by the research they publish or produce, in order to monitor and benchmark its reach.
Limitations: The AAS does not take into account the sentiment of mentions of research objects, so it cannot tell you whether the attention a piece of research received was positive or negative. According to Lockwood (2016), “Article titles with result-oriented positive framing and more interesting phrasing receive higher Altmetric [attention] scores”; the same is true of articles with catchy titles (Poplasen & Grgic, 2016). Research conflicts as to whether the number of collaborators on a paper increases or decreases an AAS (Didegah, 2016; Haustein, Costas & Larivière, 2015), while international collaboration may increase an AAS and the institutional prestige of authors reportedly does not affect it (Didegah, 2016). Legitimate self-promotion by authors may artificially increase an AAS (Adie, 2013). Journal impact factor and article accessibility may positively influence an article’s AAS; “publications from Social Sciences & Humanities have more mentions on Twitter and Facebook…[thus impacting an article’s AAS] than publications from both Engineering & Technology and Medical & Natural sciences” (Didegah, 2016). Differences in coverage and frequency of updates across data sources also produce differences in altmetric counts (Bar-Ilan & Halevi, 2017). Finally, studies show that there is very little overlap between very highly cited papers and those that receive high altmetric scores (Banshal et al., 2018; Poplasen & Grgic, 2016).
Inappropriate use cases: The AAS should not be used as a direct measure of research impact or quality of any kind.
Available metric sources: The AAS can be found in all products offered by Altmetric, including the free researcher bookmarklet and on many journal publisher websites and repositories (such as Figshare). The Dimensions database also includes the Altmetric Attention Score for the articles it indexes.
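In addition to these products, Altmetric offers a free public details API from which the score can be retrieved programmatically for a given DOI. The sketch below assumes the v1 endpoint shape (`https://api.altmetric.com/v1/doi/{doi}`) and a `score` field in the JSON response; consult Altmetric's current API documentation and rate limits before relying on it:

```python
# Sketch: look up an Altmetric Attention Score by DOI via Altmetric's
# public details API. The endpoint path and the "score" response field
# are assumptions based on the v1 API; verify against current docs.
import json
import urllib.error
import urllib.request

API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the details-API URL for a given DOI."""
    return API_BASE + doi

def fetch_attention_score(doi):
    """Return the Attention Score for a DOI, or None if Altmetric
    has no record of the output (the API returns HTTP 404)."""
    try:
        with urllib.request.urlopen(altmetric_url(doi)) as resp:
            data = json.load(resp)
        return data.get("score")
    except urllib.error.HTTPError:
        return None
```

Because the score is recomputed as new mentions arrive, values fetched this way are snapshots and may differ between queries.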
Transparency: To date, it is not possible to fully audit the AAS, as the weighting of the score depends upon non-public, company-assigned “tiers” for news sources, Twitter users, and some other sources that mention a research output.
Timeframe: Mostly post-2011. For more information on the coverage dates for sources contributing to the AAS, visit the Altmetric website.