Tag: social media

Are Imperial publications gaining attention on Bluesky?

This post is authored by Yusuf Ozkan, Research Outputs Analyst, and Dr Hamid Khan, Open Research Manager: Academic Engagement.

Researchers increasingly use social media to communicate their research. They share links to journal articles, but also other types of output like preprints, graphics/figures and lay summaries.  

That enables us to measure alternative indicators of research visibility beyond citations of, and in, journal articles. With researchers and the public scattered across services such as X, Mastodon, Threads and LinkedIn, tracking visibility is difficult. Bluesky recently joined the club and is growing rapidly. In this post, we highlight how research-related conversations and citations of Imperial outputs have increased on Bluesky, emphasising the value of using the Library’s tools to track citations on social media.

Although Bluesky is a relatively new platform – launched in 2023 as an invitation-only service – it has reached nearly 30 million users at the time of writing. The number of users increased by seven million in just six weeks from November 2024.  

Many people have migrated from X (formerly Twitter) to Bluesky during this period, partly following the US election, but the reasons for migration are not limited to politics. Bluesky has also surpassed Threads in website user numbers. The rapid increase in users and the growing trend of researchers joining Bluesky are making it an increasingly convenient forum for research conversations.

Given the increase in users, we would expect to see research outputs being shared more widely on Bluesky. But it is extremely difficult, if not impossible, to manually measure that. This is where Altmetric comes into play, to track mentions of outputs.  

Altmetric is a tool that provides data on online attention to research by identifying mentions of research outputs on social media, blog sites, Wikipedia, news outlets and more. Altmetric donuts and badges display an attention score summarising all the online engagement with a scholarly publication. Altmetric can be useful for showing societal visibility and impact, though its limitations should be kept in mind. Imperial Library has a subscription to Altmetric, which we can use to see how social media users interact with Imperial’s research outputs. It’s one of many tools we use to support researchers in moving away from journal-based metrics for evaluating the reach of their work.
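For those who want to pull these attention figures programmatically, Altmetric also offers a public “details page” API that returns the attention data for a single output looked up by DOI. Below is a minimal sketch in Python, assuming the free v1 endpoint (`https://api.altmetric.com/v1/doi/{doi}`); the DOI used is purely illustrative, and the exact fields returned (beyond the overall `score`) vary by paper and over time.

```python
import json
import urllib.request

# Template for Altmetric's public details endpoint (free tier, DOI lookup).
ALTMETRIC_API = "https://api.altmetric.com/v1/doi/{doi}"


def altmetric_url(doi: str) -> str:
    """Build the details-page API URL for a given DOI."""
    return ALTMETRIC_API.format(doi=doi)


def fetch_attention(doi: str) -> dict:
    """Fetch and parse the attention data for a DOI.

    The response is JSON; it includes an overall 'score' field, plus
    per-source counts whose names and presence vary between papers.
    Raises urllib.error.HTTPError (404) if Altmetric has no record.
    """
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)


# Example of building a request URL (no network call made here):
print(altmetric_url("10.1234/example"))
```

In practice you would call `fetch_attention(...)` with a real DOI and inspect the returned dictionary; for bulk or institutional reporting, Altmetric’s subscription tools (such as the Explorer available via the Library) are the more appropriate route.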

The migration of researchers away from X prompted Altmetric to start monitoring emerging platforms, leading to the inclusion of Bluesky in Altmetric statistics in December 2024, although Altmetric had been picking up citations on Bluesky since October.

Nearly 400 thousand Bluesky posts cited a research paper between late October and mid-January – a period of less than three months. That is a significant milestone, considering it took Twitter nine years to reach 300 thousand posts linking to a research paper.

Altmetric picked up a dramatic global increase in mentions of research outputs on Bluesky from November 2024

Bluesky is a rising star for research conversations online, but what is the situation when it comes to mentions of Imperial research outputs? Well, the trend is no different from the overall picture.

Altmetric identifies over 11,000 Bluesky mentions of publications associated with Imperial authors from November 2024 to January 2025. The number of Imperial output mentions on X is four times higher than on Bluesky for the same period. Given that Bluesky launched recently and has ten times fewer users than X, the figure is still substantial.

Bluesky is the second-most-referenced source type after X for research outputs tracked by Altmetric, November 2024 – January 2025.

The mentions of Imperial publications on Bluesky followed a similar trend to the overall mentions of research outputs on the platform. There was a massive uptick in mid-November 2024, taking the number from a few mentions to thousands per week. Although the number of mentions appears to be coming down, the increasing number of overall Bluesky users and posts suggests citations are not likely to return to their pre-November level.

Massive uptick in mentions of Imperial publications on Bluesky from mid-November 2024

Comparing all-time mentions on Bluesky and X gives us another perspective on how sharing practices have changed. The number of X mentions of Imperial outputs has decreased consistently since 2021, from 620K mentions that year to 270K in 2024. If this trend continues, we expect to see just over 100K mentions in 2025.

Mentions of Imperial research outputs on X peaked in 2021 and have plummeted ever since

Even though Bluesky is just two years old and Altmetric has been including mentions from the platform for only three months, the volume of mentions is impressive.

Bluesky is a new social media platform whose user base is growing. The volume of research-related conversation on Bluesky has increased since October 2024, making it the second-largest data source tracked by Altmetric over the past three months. Imperial research outputs are widely shared on the platform too, with over 10K citations in the same period. But there is a note of caution.

Social media is great for increasing visibility and reach. It can be a good way to encourage open and collaborative peer review, and ultimately help improve quality and impact. However, metrics provided by tools like Altmetric can be misleading, as they don’t track everything happening on the internet. For example, Altmetric only includes historical data for LinkedIn: current mentions are not tracked, despite the presence of many researchers there.

Social media platforms have some biases, such as vulnerability to manipulation and gaming (just like the Journal Impact Factor), imbalanced user demographics, and either over- or under-representation of an academic discipline on one platform. Counting citations is a risky business, because social media mentions do not necessarily point to positive impact or high quality. Someone could be critiquing or rebutting your work when citing it. Despite these limitations, diverse platforms for sharing research are good for discoverability, since users of one platform may not use another. This increases the potential impact of research by reaching diverse audiences. Bluesky is a recent and promising example, demonstrating how emerging platforms can broaden the reach and visibility of research publications.

To see how your research is being seen and cited on social media, you can make use of the Library’s subscription to Altmetric. Get in touch with the Bibliometrics service to discuss ways to measure the visibility and impact of your work other than the flawed Journal Impact Factor.

Note: This post was authored in mid-January. Therefore, some of the figures might have changed by the time of publication.

1:AM London Altmetrics Conference 25-26 September 2014

Held at the Wellcome Collection in London and organised by Altmetric.com and the Wellcome Trust, this was the very first conference to focus solely on alternative metrics and their use by funders, universities and researchers.

The first day began with an introduction from seven different altmetrics providers to their products. Although similar, they each do something slightly different in how they measure their metrics and present them.

Below is a summary of the event, with a more comprehensive blog available from the organisers here.

Altmetrics, by  AJ Cann https://www.flickr.com/photos/ajc1/6795008004 Licensed CC BY SA 2.0

How are people using altmetrics now?

During this session we heard from a range of stakeholders, including representatives from the Jisc funded project IRUS, a university-publisher collaborative project, and an academic who studies altmetrics as part of his research.

IRUS is using article-level metrics to answer the question: are people using university repositories? The answer is yes, and IRUS can help repository managers to benchmark their repository contents and use. IRUS allows an institution to check the quality of its metadata, and also provides COUNTER-compliant statistics that can be trusted.

Snowball Metrics is a university-driven and Elsevier-facilitated project that has produced a number of “recipes” designed to help universities use altmetrics for benchmarking. This takes metrics beyond the individual paper or researcher, and allows the university to assess a department as a whole. However altmetrics alone are not good enough to judge scholarly quality.

Finally Mike Thelwall, based at the University of Wolverhampton, presented his research group’s findings. Mike has been investigating how altmetrics relate to citation scores and overall has found a positive but weak correlation. Twitter seems to lead to more publicity for a paper, but doesn’t necessarily lead to more citations; Mendeley’s read count has a much stronger correlation with citations.

What’s going on in the communication of research?

This session gave us a great opportunity to hear from two active researchers on how they communicate their research to an academic audience and beyond. What was apparent was that Renée Hlozek, a postdoctoral researcher, had a lot more time to spend not only on actual research, but also on creative ways to communicate her research to a wider audience. For example, she is active on Twitter, blogs and is a current TED Senior Fellow.

As a professor, Bjorn Brembs spends more time on teaching and university administration. This means he struggles to find time to promote his research more widely, for example on social media. This is just one example of the importance of context when it comes to interpreting altmetrics: a researcher could find their work attracting varying altmetric scores depending on the stage of their career.

Impact assessment in the funding sector: the role of altmetrics

This session first heard from James Wilsdon, who is chairing the steering group on the role of metrics in research assessment for HEFCE. The group called for evidence from publishers, researchers and other stakeholders and received over 150 responses. There are loud voices both for and against altmetrics, and the full responses would be published on the HEFCE website in early 2015.

Representatives from three different funders then spoke, including the Wellcome Trust, Science Foundation Ireland and the Association of Medical Research Charities. All three identified the need for researchers to show evidence of engagement with a wider audience and providing greater value for money. Altmetrics have the potential to give funders a lot more information about the research they fund by highlighting attention to articles before they are cited. However, Ruth Freeman from Science Foundation Ireland warned against using altmetrics in isolation, and Adam Dinsmore from Wellcome agreed that the altmetrics “score” is less important than the conversations happening online.

Altmetrics and publishers

The publishers who spoke identified what they saw as the two primary uses for altmetrics in publishing. First, they allow the author to track how popular their work is; second, altmetrics can help with discoverability. Both PLoS and Springer are planning to use altmetrics to create cross-journal highlights for specific subject areas, for example Neurostars from Springer.

The open access publisher PLoS was the first publisher to introduce article-level metrics. Jennifer Lin explained that PLoS plans to do more to reveal the stories behind the numbers. To do this they need to advocate for improvements to article metadata, and they see ORCID as something that will help disambiguate author information.

Workshops

During the final session of the conference, we attempted to reach some final conclusions and also to think about what developments we would like to see in the future. There were three main points:

  1. The need for standardisation was identified – there are a number of different organisations that are collecting and measuring alternative metrics. Some standardisation is necessary to ensure the results are comparable and trustworthy.
  2. A lot of data is being collected, but there are a lot of improvements to be made in the interpretation and use of the data. The use of altmetrics by funders, REF, etc. should be as transparent as possible.
  3. In all cases, the use of altmetrics should include a consideration of context, and altmetrics should be used to create a story of impact that follows research from the lab to publication to policy implementation.

Altmetrics at Imperial

Symplectic and Spiral both feature altmetrics from Altmetric.com, displayed as a colourful “donut”. You can see an example in Spiral here. Clicking on the icon will take you to the Altmetric page for that article, where you can explore the Tweets and blogs that have mentioned it.