Last night saw the launch of the Open Access Button, coinciding with worldwide Open Access Week. The team behind the Open Access Button aim to help researchers, students and the general public access research papers that are behind paywalls and beyond their means.
The idea came from two medical students who were frustrated at being unable to access all the research they wanted to read, having found that the average cost to read a paywalled article was $30. Although the team has expanded to include partnerships with Cottage Labs, Jisc and more, a large number of students still donate their time to the project. Work began on the Button last year with a beta project, during which 5,000 people recorded almost 10,000 encounters with paywalls or denied access.
The new version of the Open Access Button is a browser plug-in that adds a button to click any time a paywall stops you accessing an article. The system registers information about the article and your location to create a map of researchers who need access to information.
The Open Access Button will try to find a free-to-access version of the article, for example a pre-print deposited in an institutional or subject repository. If an alternative version cannot be found, the Button will email the author to let them know that someone wants to access their research but cannot, and suggest that they deposit a copy in a repository.
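For the technically minded, the workflow is simple enough to sketch in code. The snippet below is a minimal, hypothetical illustration of what a paywall-reporting button might do when clicked; the endpoint, payload fields and response shape are invented for illustration and are not the Open Access Button's actual code or API.

```typescript
// A minimal sketch of what a paywall-reporting browser button might do.
// Everything here is hypothetical: the endpoint, payload fields and
// response shape are invented for illustration, not the Button's real API.
async function reportPaywall(): Promise<void> {
  const report = {
    articleUrl: window.location.href, // the paywalled article
    title: document.title,
    reportedAt: new Date().toISOString(),
  };

  // The service records the report (and can geolocate the request to
  // build its map of blocked readers), then looks for a free copy,
  // e.g. a pre-print in an institutional or subject repository.
  const response = await fetch("https://example.org/api/paywall-reports", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });

  const result: { openAccessUrl?: string } = await response.json();
  if (result.openAccessUrl) {
    window.open(result.openAccessUrl); // a free version was found
  }
  // Otherwise the service can email the author, suggesting they
  // deposit a copy in a repository.
}
```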
Upon clicking the button, users are asked to enter a few sentences about why they want to read the article and what they could do if the research were available open access. The creators hope to use this information for open access advocacy, and to create stories that connect researchers, their work and readers around the world.
Keep up to date with the project on Twitter: @OA_Button.
The first altmetrics conference

Held at the Wellcome Collection in London and organised by Altmetric.com and the Wellcome Trust, this was the very first conference to focus solely on alternative metrics and their use by funders, universities and researchers.
The first day began with introductions from seven different altmetrics providers to their products. Although similar, each provider does something slightly different in how it measures and presents its metrics.
Below is a summary of the event; a more comprehensive blog post is available from the organisers.
How are people using altmetrics now?
During this session we heard from a range of stakeholders, including representatives from the Jisc-funded project IRUS (Institutional Repository Usage Statistics), a university-publisher collaborative project, and an academic who studies altmetrics as part of his research.
IRUS is using article-level metrics to answer the question: are people using university repositories? The answer is yes, and IRUS can help repository managers to benchmark their repository contents and use. IRUS allows an institution to check the quality of its metadata, and also provides COUNTER-compliant statistics that can be trusted.
Snowball Metrics is a university-driven, Elsevier-facilitated project that has produced a number of “recipes” designed to help universities use altmetrics for benchmarking. This takes metrics beyond the individual paper or researcher, and allows a university to assess a department as a whole. However, the speakers stressed that altmetrics alone are not sufficient to judge scholarly quality.
Finally, Mike Thelwall, based at the University of Wolverhampton, presented his research group’s findings. Mike has been investigating how altmetrics relate to citation scores and has found an overall positive but weak correlation. Twitter seems to generate more publicity for a paper but doesn’t necessarily lead to more citations; Mendeley’s reader count has a much stronger correlation with citations.
What’s going on in the communication of research?
This session gave us a great opportunity to hear from two active researchers on how they communicate their research to an academic audience and beyond. What was apparent was that Renée Hlozek, a postdoctoral researcher, had much more time to spend not only on the research itself, but also on creative ways to communicate it to a wider audience. For example, she is active on Twitter, writes a blog and is a current TED Senior Fellow.
As a professor, Björn Brembs spends more time on teaching and university administration. This means he struggles to find time to promote his research more widely, for example on social media. This is just one example of the importance of context when interpreting altmetrics: a researcher’s work could attract very different altmetric scores depending on the stage of their career.
Impact assessment in the funding sector: the role of altmetrics
This session first heard from James Wilsdon, who is chairing the steering group on the role of metrics in research assessment for HEFCE. The group called for evidence from publishers, researchers and other stakeholders and received over 150 responses. There are loud voices both for and against altmetrics, and the full findings will be published on the HEFCE website in early 2015.
Representatives from three different funders then spoke, including the Wellcome Trust, Science Foundation Ireland and the Association of Medical Research Charities. All three identified the need for researchers to show evidence of engagement with a wider audience and to provide greater value for money. Altmetrics have the potential to give funders much more information about the research they fund by highlighting attention paid to articles before they are cited. However, Ruth Freeman from Science Foundation Ireland warned against using altmetrics in isolation, and Adam Dinsmore from Wellcome agreed that the altmetrics “score” is less important than the conversations happening online.
Altmetrics and publishers
The publishers who spoke identified what they saw as the two primary uses for altmetrics in publishing. First, they allow the author to track how popular their work is; second, altmetrics can help with discoverability. Both PLoS and Springer are planning to use altmetrics to create cross-journal highlights for specific subject areas, for example Neurostars from Springer.
The open access publisher PLoS was the first to introduce article-level metrics. Jennifer Lin explained that PLoS plan to do more to reveal the stories behind the numbers. To do this they need to advocate for improvements to article metadata, and they see ORCID as a tool that will help disambiguate author information.
Workshops
During the final session of the conference, we attempted to reach some conclusions and to think about what developments we would like to see in the future. There were three main points:
The need for standardisation: a number of different organisations are collecting and measuring alternative metrics, and common standards are necessary to ensure the results are comparable and trustworthy.
A great deal of data is being collected, but there is much room for improvement in how that data is interpreted and used. The use of altmetrics by funders, the REF and others should be as transparent as possible.
In all cases, altmetrics should be interpreted with a consideration of context, and used to build a story of impact that can be followed from the lab, through publication, to policy implementation.
Altmetrics at Imperial
Symplectic and Spiral both feature altmetrics from Altmetric.com, displayed as a colourful “donut”. You can see an example in Spiral. Clicking on the donut will take you to the Altmetric page for that article, where you can explore the tweets and blog posts that have mentioned it.
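If you want to display the same donut on your own pages, Altmetric offers an embeddable badge. The sketch below shows how such a badge is typically added using Altmetric’s embed script; the DOI is a placeholder, and the exact attributes should be checked against Altmetric’s own embed documentation.

```typescript
// Sketch: adding an Altmetric donut badge to a page for one article.
// Based on Altmetric's badge-embed convention (a div with data-* attributes
// plus their embed script); the DOI below is a placeholder.
const badge = document.createElement("div");
badge.className = "altmetric-embed";
badge.setAttribute("data-badge-type", "donut");
badge.setAttribute("data-doi", "10.1234/placeholder"); // replace with a real DOI
document.body.appendChild(badge);

// Altmetric's embed script scans the page and renders the donut
// for every element with the "altmetric-embed" class.
const script = document.createElement("script");
script.src = "https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js";
script.async = true;
document.head.appendChild(script);
```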