1:AM, the First Altmetrics Conference
As described in a post entitled Analytics Events: For Learning and For Research, the 1:AM conference, the first dedicated altmetrics conference, took place in London last week.
This was a fascinating conference, with lively discussion taking place both in the room and on the #1amconf Twitter back channel.
The conference embraced event amplification technologies, with a number of remote speakers giving their talks using Google Hangouts and all of the plenary talks being live-streamed and made available on the conference’s YouTube channel.
With so much discussion taking place across a range of channels, I created a Lanyrd entry for the conference and publicised it on the final day.
I’m pleased to say that many of the participants and event organisers used the Lanyrd page to provide access to reports on the sessions, slides used by the speakers, video recordings of the talks, photos of the event and archives of the discussions and arguments which took place on Twitter: at the time of writing, links have been added to 35 separate resources.
Altmetrics as an Indicator of Quality or of Interest?
On the opening morning of the conference in particular there were lively discussions on the value of altmetrics, with Professor David Colquhoun (@David_Colquhoun) being particularly scathing in his criticisms:
To show that [altmetrics] trivialises and corrupts science is to look at high scoring papers
The blog post on Why you should ignore altmetrics and other bibliometric nightmares mentioned in this tweet generated much discussion on the blog and elsewhere. For those with an interest in this area I recommend reading the post and the follow-up comments, such as this response from Euan Adie, founder of the Altmetric.com company:
Hi David. Thanks for writing the post! I founded Altmetric.com. I think you and Andrew have some fair points, but wanted to clear up the odd bit of confusion.
I think your underlying point about metrics is fair enough (I am happy to disagree quietly!). You’re conflating metrics, altmetrics and attention though.
Before anything else, to be absolutely, completely clear: I don’t believe that you can tell the quality of a paper from numbers (or tweets). The best way to determine the quality of a paper is to read it. I also happen to agree about post publication review and that too much hype harms science.
Euan concluded his comment by providing a link to his post suggesting that those with an interest in the impact of scientific research should Broaden your horizons: impact doesn’t need to be all about academic citations.
The consensus at the conference seemed to be that the view (perhaps based on misunderstandings) that altmetrics would provide an alternative to citation analysis for determining the quality of research, and should determine how research is funded, is no longer widely accepted. Instead, altmetrics are regarded as complementary to citation data, providing a broader picture, especially of how research is being discussed and debated.
Raising the Visibility of One’s Research: Kudos
In discussions with other participants I heard that the view that researchers (and funders of research) have a responsibility for raising the visibility of their research is becoming accepted: the view that only one’s peers need be interested in the research was felt to be no longer relevant. “We need to be seen to be able to justify funding for research” was one comment I heard.
Back in March 2012 in a post on Marketing for Scientists Martin Fenner made a similar point:
Scientists may feel uncomfortable about marketing their work, but we all are doing it already. We know that giving a presentation at a key meeting can be a boost for our career, and we know about the importance of maintaining an academic homepage listing our research interests and publications. And people reading this blog will understand that a science blog can be a powerful marketing tool.
But if researchers have now accepted the need to raise the visibility of their research, the question is what tools they can use to support this goal.
The session on Altmetrics in the last year and what’s on the roadmap provided brief summaries of altmetrics applications, including talks about Altmetric, Plum Analytics, Impactstory, PLOS, Mendeley, Open Access Scholarly Publishing Association and Kudos.
Kudos was the one tool which was new to me. A recent post which describes how Kudos Integrates Altmetric Data to Help Researchers see Online Dissemination of Articles summarised the aim of the service:
Kudos is a new service designed to help scholars and their institutions increase the impact of their published research articles. Altmetric tracks and collates mentions of research articles on social media, blogs, news outlets and other online sources. This integration means mentions are now incorporated on the Kudos metrics pages for individual authors, and accompanied by a short summary which further details the number of mentions per source. Each article is assigned a score based on the amount of attention it has received to date, and authors are able to click through to see a sample of the original mentions of their article.
I have created an account on Kudos and was able to quickly claim many of my research papers. As can be seen from the screenshot of the dashboard, a number of my papers already have an Altmetric score, which is defined as “a reflection of the amount of interest your publication has attracted across news outlets and social media”.
My paper on Accessibility 2.0: Next Steps for Web Accessibility, for example, has an Altmetric score of 6. If I wanted to raise the visibility and impact of the paper, the Kudos tool allows me to:
Explain: Explain your work and tell readers what it’s about and why it’s important.
Enrich: Enrich your publication by adding links to related materials.
Share: Share a link to your publication by email and social media.
Measure: Measure the impact on your publication performance.
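As an aside for the technically minded: the Altmetric score which Kudos displays can also be retrieved directly from Altmetric’s public REST API, which exposes article-level data by DOI. The sketch below builds the request URL and extracts the score from the JSON response; the DOI shown is a placeholder, not one of my papers:

```python
import json
import urllib.request

API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the public Altmetric API URL for a given DOI."""
    return API_BASE + doi

def extract_score(payload):
    """Pull the headline Altmetric score out of an API JSON response."""
    return json.loads(payload).get("score")

# Fetching requires network access; the DOI below is a placeholder.
# with urllib.request.urlopen(altmetric_url("10.1000/example-doi")) as resp:
#     print(extract_score(resp.read()))
```

Real calls would of course need error handling, since the API returns an HTTP error for DOIs it holds no data for.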
Raising the Visibility of One’s Research: Wikipedia
In a recent post entitled Wikimedia and Metrics: A Poster for the 1:AM Altmetrics Conference I described metrics for Wikipedia articles which may indicate the effectiveness of an article’s outreach. The post summarised a poster which was displayed at the conference and is shown in this post.
As usage metrics may show, Wikipedia can provide a mechanism for raising the visibility of topics described in its articles, which can include articles based on research work.
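Such usage metrics can be gathered programmatically. As an illustrative sketch (the article title and date range are placeholders), the Wikimedia Pageviews REST API returns daily view counts for a given article, which can be summed into a simple visibility indicator:

```python
import json

PAGEVIEWS_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(article, start, end, project="en.wikipedia"):
    """Build a Pageviews API URL for daily views of one article."""
    return (f"{PAGEVIEWS_BASE}/{project}/all-access/all-agents/"
            f"{article}/daily/{start}/{end}")

def total_views(payload):
    """Sum the daily view counts in a Pageviews API JSON response."""
    return sum(item["views"] for item in json.loads(payload)["items"])

# Fetching requires network access; title and dates are placeholders.
# url = pageviews_url("Altmetrics", "20141001", "20141031")
```

This only measures raw attention, of course; like the Altmetric score, it says nothing about the quality of what is being read.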
It would appear that Kudos and Wikipedia both provide mechanisms for enhancing interest in research work. But these two tools provide contrasting approaches to the way they support such dissemination work.
With Kudos, authors of research papers are expected to provide summaries of their work by: (a) adding a short title to the publication, to make it easier to find and help increase citations; (b) adding a simple, non-technical explanation, to make the publication easier to find and more accessible to a broader audience; and (c) adding an explanation of what is most unique and/or timely about the work, and the difference it might make, to help increase readership.
In contrast, content added to Wikipedia should be provided based on the fundamental principles of Wikipedia, known as the five pillars. In brief:
- Wikipedia is an encyclopedia: It combines many features of general and specialized encyclopedias, almanacs, and gazetteers. Wikipedia is not a soapbox, an advertising platform, a vanity press, an experiment in anarchy or democracy, an indiscriminate collection of information, or a web directory.
- Wikipedia is written from a neutral point of view: We strive for articles that document and explain the major points of view, giving due weight with respect to their prominence in an impartial tone. We avoid advocacy and we characterize information and issues rather than debate them.
- Wikipedia is free content that anyone can use, edit, and distribute: Since all editors freely license their work to the public, no editor owns an article and any contributions can and will be mercilessly edited and redistributed. Respect copyright laws, and never plagiarize from sources.
- Editors should treat each other with respect and civility: Respect your fellow Wikipedians, even when you disagree. Apply Wikipedia etiquette, and don’t engage in personal attacks. Seek consensus, avoid edit wars, and never disrupt Wikipedia to illustrate a point.
- Wikipedia has no firm rules: Wikipedia has policies and guidelines, but they are not carved in stone; their content and interpretation can evolve over time. Their principles and spirit matter more than their literal wording, and sometimes improving Wikipedia requires making an exception.
The second of these principles, which expects Wikipedia articles to be written from a neutral point of view, will be the most challenging for researchers who would like to use Wikipedia to raise the visibility of their research to a wider audience. One of the three core content policies for Wikipedia articles is that content should be provided from a neutral point of view, and it will be difficult to do this if you wish to publish or cite content based on your own research. Another challenge for researchers is a second core content policy, which states that Wikipedia articles must not contain original research.
What Is To Be Done?
Perhaps a simple first step for open researchers who are willing to share their experiences openly would be to ensure that the initial desktop research which typically forms a literature review is also used to support and improve existing Wikipedia articles.
However, the bigger challenge is to address the tension between funders’ requirement that the research they fund is widely disseminated and exploited by others, and Wikipedia’s requirement for neutrality.
In a recent post on Links From Wikipedia to Russell Group University Repositories I highlighted similar challenges for universities which may be tempted to seek to exploit the SEO benefits which links from Wikipedia to institutional web pages may provide.
In that blog post I cited an article from the PR community recognising the danger that PR companies can easily be tempted to provide links to clients’ web sites for similar reasons. In response to concerns raised by the Wikipedia community, Top PR Firms Promise[d] They Won’t Edit Clients’ Wikipedia Entries on the Sly. The article describes the Statement on Wikipedia from participating communications firms, which is hosted on Wikipedia. The following statement was issued on 10 June 2014:
On behalf of our firms, we recognize Wikipedia’s unique and important role as a public knowledge resource. We also acknowledge that the prior actions of some in our industry have led to a challenging relationship with the community of Wikipedia editors. Our firms believe that it is in the best interest of our industry, and Wikipedia users at large, that Wikipedia fulfill its mission of developing an accurate and objective online encyclopedia. Therefore, it is wise for communications professionals to follow Wikipedia policies as part of ethical engagement practices. We therefore publicly state and commit, on behalf of our respective firms, to the best of our ability, to abide by the following principles:
- To seek to better understand the fundamental principles guiding Wikipedia and other Wikimedia projects.
- To act in accordance with Wikipedia’s policies and guidelines, particularly those related to “conflict of interest.”
- To the extent we become aware of potential violations of Wikipedia policies by our respective firms, to investigate the matter and seek corrective action, as appropriate and consistent with our policies.
- Beyond our own firms, to take steps to publicize our views and counsel our clients and peers to conduct themselves accordingly.
We also seek opportunities for a productive and transparent dialogue with Wikipedia editors, inasmuch as we can provide accurate, up-to-date, and verifiable information that helps Wikipedia better achieve its goals.
A significant improvement in relations between our two communities may not occur quickly or easily, but it is our intention to do what we can to create a long-term positive change and contribute toward Wikipedia’s continued success.
Might research councils and other funders of research find it useful to embrace similar principles? And is there a role for research librarians, and others with responsibilities for supporting the research community, in developing similar guidelines? Such guidelines could help ensure that researchers use Wikipedia in ways which support the Wikipedia principles that have helped make the encyclopedia a valuable source of information.