UK Web Focus

Innovation and best practices for the Web

Posts Tagged ‘solo12impact’

The ‘Altmetrics everywhere – but what are we missing?’ #solo12impact Session

Posted by Brian Kelly (UK Web Focus) on 10 November 2012

I’m looking forward to attending the session on “Altmetrics everywhere – but what are we missing?” which takes place on Monday at the SpotOn London (#SOLO12) conference.

In a post entitled Altmetrics everywhere – but what are we missing? #solo12impact Alan Cann, the workshop co-facilitator, has provided a taster for the session. In the post Alan describes how:

In the last couple of years altmetrics (the creation and study of new metrics based on social media for analyzing and informing scholarship) have popped up across the web. 

Alan refers to a recent guest post on this blog entitled Social Media Analytics for R&D: a Catalan Vision which suggests a range of parameters which may be relevant. However, Alan feels that:

The reality is that this is too complex for those of us with lives and jobs. We need services / dashboards to provide and digest this information.

I agree: the research community will need similar dashboards which can provide indications of engagement and outreach. Alan mentions a number of possible solutions. He is dismissive of Klout, and I would agree that it is not appropriate in our context (although an advertising agency wishing to decide which Twitter star to employ to post sponsored tweets might find it a useful aid to the selection process). Alan is more positive about Kred, but his preferred tool seems to be CrowdBooster. Alan’s post includes screenshots which illustrate the data visualisation provided by the tool.

I have also recently started to make use of CrowdBooster. However, I feel that the dashboard provided by the Twentyfeet service is better.

The screenshot illustrates one of the dashboard views of my Twitter engagement during October 2012.

However Twentyfeet (also known as 20ft.net) is not popular in some quarters as the free version sends a single weekly tweet summarising the data over the previous week.

It is possible to disable this alert for a small annual fee (of, I think, ~$12 per year), although since it is only a single weekly tweet it should not be too intrusive.

I will be making comparisons between these services once CrowdBooster has aggregated a sufficient number of my tweets for valid comparisons to be made. For now, I hope this contribution to the #solo12impact session will be of interest to the participants.



Posted in Evidence | 1 Comment »

Understanding the Limits of Altmetrics: Slideshare Statistics

Posted by Brian Kelly (UK Web Focus) on 8 November 2012

About AltMetrics

Cricketers like statistics, as we know from the long-standing popularity of Wisden, the cricketing almanack which was first published in 1864. Researchers have similar interests, with, in many cases, their professional reputation being strongly influenced by statistics. For researchers the importance of citation data is now being complemented by a new range of metrics, known as altmetrics, which are felt to be more relevant to today’s fast-moving digital environment. The altmetrics manifesto explains how:

Peer-review has served scholarship well, but is beginning to show its age. It is slow, encourages conventionality, and fails to hold reviewers accountable. 

and goes on to describe how:

Altmetrics expand our view of what impact looks like, but also of what’s making the impact. 

However the manifesto concludes with a note of caution:

Researchers must ask if altmetrics really reflect impact, or just empty buzz. Work should correlate between altmetrics and existing measures, predict citations from altmetrics, and compare altmetrics with expert evaluation. Application designers should continue to build systems to display altmetrics, develop methods to detect and repair gaming, and create metrics for use and reuse of data. Ultimately, our tools should use the rich semantic data from altmetrics to ask “how and why?” as well as “how many?”

Altmetrics are in their early stages; many questions are unanswered. But given the crisis facing existing filters and the rapid evolution of scholarly communication, the speed, richness, and breadth of altmetrics make them worth investing in.

As I described in a post on “What Can Web Accessibility Metrics Learn From Alt.Metrics?” there can be a danger in the uncritical acceptance of metrics. I therefore welcome this recognition of the need to explore the approaches which are currently being developed. In particular I am looking forward to the sessions on Altmetrics beyond the Numbers and Assessing social media impact which will be held at the SpotOn London 2012 conference, taking place in London on 11-12 November. In a blog post entitled Altmetrics everywhere – but what are we missing? #solo12impact Alan Cann touches on the strengths and weaknesses of some of the well-known social analytics tools:

It astounds me that Klout continues to attract so much attention when it has been so thoroughly discredited - Gink is a more useful tool in my opinion ;-)

The best of this bunch is probably Kred, which at least has a transparent public algorithm. In reality, the only tool in this class I use is CrowdBooster, which has a number of useful functions.

But beyond Twitter analytics, what of metrics associated with the delivery of talks about one’s research activities? This is an area of interest to the altmetrics community, as can be seen from the development of the Impactstory service which “aggregates altmetrics: diverse impacts from your articles, datasets, blog posts, and more”. As described in the FAQ:

The system aggregates impact data from many sources and displays it in a single report, which is given a permaurl for dissemination and can be updated any time.

The service is intended for:

  • researchers who want to know how many times their work has been downloaded, bookmarked, and blogged
  • research groups who want to look at the broad impact of their work and see what has demonstrated interest
  • funders who want to see what sort of impact they may be missing when only considering citations to papers
  • repositories who want to report on how their research artifacts are being discussed
  • all of us who believe that people should be rewarded when their work (no matter what the format) makes a positive impact (no matter what the venue). Aggregating evidence of impact will facilitate appropriate rewards, thereby encouraging additional openness of useful forms of research output.

In addition to analysis of published articles, datasets, Web sites and software the service also aggregates slides hosted on Slideshare.
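The aggregation model described in the FAQ can be illustrated with a minimal sketch: per-source metrics for each research artifact are merged into a single report. This is not the actual Impactstory implementation or API, and all the names and numbers below are invented placeholders for illustration only.

```python
# Illustrative sketch of altmetrics aggregation: combine per-source
# counts for each research artifact into a single report.
# All sources, titles and numbers here are hypothetical examples,
# not real Impactstory data.

def aggregate_impact(artifacts):
    """Merge per-source metrics into one totals dictionary per artifact."""
    report = {}
    for artifact in artifacts:
        totals = {}
        for source, metrics in artifact["sources"].items():
            for metric, count in metrics.items():
                totals[metric] = totals.get(metric, 0) + count
        report[artifact["title"]] = totals
    return report

artifacts = [
    {
        "title": "Example paper",
        "sources": {
            "slideshare": {"views": 1200, "downloads": 40},
            "blog": {"views": 300},
            "twitter": {"mentions": 25},
        },
    },
]

print(aggregate_impact(artifacts))
```

Note that a simple sum like this already raises the question explored below: a “view” recorded by one source may not mean the same thing as a “view” recorded by another.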

Metrics for Slideshare

Metrics for Slide Usage at Events

In May 2011 a post entitled Evidence of Slideshare’s Impact summarised use of slides hosted on Slideshare for talks which have been presented at UKOLN’s IWMW events from IWMW 2006 to IWMW 2010.

A year later, following a tweet in which @MattMay asked “Why does everybody ask for slides during/after a presentation? What do you do with them? I’m genuinely curious” I published an updated post on Trends in Slideshare Views for IWMW Events. In the post I suggested the following reasons for why speakers and event organisers may wish to host slides on Slideshare:

  • To enable a remote audience to view slides for a presentation they may be watching on a live video stream, listening to on an audio stream or even simply following via the tweets (providing a slide number on the slides makes it easier for people tweeting to identify the slide being used).
  • To enable the slides to be viewed in conjunction with a video recording of the presentation.
  • To enable my slides to be embedded elsewhere, so that the content can be reused in a blog post or on a web page.
  • To enable the content of the slides to be reused, if it is felt to be useful to others. Note that I provide a Creative Commons licence for the text of my slides, try to provide links to screenshots and give the origin of images which I may have obtained from others.
  • To enable slides to be viewed easily on a mobile device.
  • To provide a commentable facility for the slides.
  • To enable my slides to be related, via tags, to related slideshows.

The usage statistics for talks given at IWMW events were published in order to demonstrate the interest in accessing such slides and to encourage speakers and workshop facilitators to make their slides available. But beyond the motivations for event organisers, what of the individual speaker?

Metrics for Individuals

My interest in metrics for Slideshare dates back to December 2010, when I published a post which asked What’s the Value of Using Slideshare? In August 2010 Steve Wheeler (@timbuckteeth) had tweeted that:

Ironically there were 15 people in my audience for this Web 3.0 slideshow but >12,000 people have since viewed it http://bit.ly/cPfjjP

As can be seen, there have now been over 58,000 views of Steve’s slides on Web 3.0: The Way Forward?

In light of Steve’s experiences, and the growing relevance of metrics for Slideshare suggested by the development of the Impactstory service, when a paper by myself, Martyn Cooper, David Sloan and Sarah Lewthwaite on “A Challenge to Web Accessibility Metrics and Guidelines: Putting People and Processes First” was accepted for the W4A 2012 conference earlier this year, the co-authors agreed to ensure that our professional networks were made aware of the paper and the accompanying slides. The aim was to maximise the number of downloads which, we hoped, would increase the number of citations in the future, but also to facilitate discussion around the ideas presented in the paper.

We monitored usage statistics for the slides and found that during the week of the conference there had been 1,391 views, compared with between 3 and 351 views for other slides which used the #W4A2012 conference hashtag. To date, as illustrated, there have been 7,603 views.

I used this example in a talk on Using Social Media to Promote ‘Good News’ which I gave at a one-day event organised by the AHRC (Arts and Humanities Research Council) which took place at the same time as the W4A 2012 conference. I was therefore able to observe how interest in the slides developed, which included use of the Topsy service. This service highlighted the following tweets:

@stcaccess (STC AccessAbilitySIG), 17 April 2012 (7 similar tweets):
Enjoyed “Challenge to Web Accessibility Metrics & Guidelines” slides from @sloandr & Co. http://t.co/XOoQNnlo #w4a12 #a11y #metrics

@nethermind (Elle Waters), 19 April 2012 (2 similar tweets):
We need more of this = #W4A slides by @martyncooper @briankelly @sloandr @slewth - Learner analytics & #a11y metrics: http://t.co/GHHfhLcv

@crpdisabilities (Bill Shackleton), 16 April 2012 (2 similar tweets):
A Challenge to Web #Accessibility Metrics & Guidelines: Putting People & Processes First #A11y #Presentation http://t.co/fehzsbDR

I’ve used this example to illustrate how analysis of the use of Twitter at conferences can help to see how people are engaging with talks. In this example the Twitter IDs STCAccess and CRPDisabilities indicated that those working in accessibility were engaging with our paper and spreading the ideas across their networks.

Do the Numbers Add Up?

In a series of talks given during Open Access 2012 week I described the importance of social media in raising the visibility of research papers, including papers hosted on institutional repositories. However, when I examined the statistics in more detail I realised that the numbers didn’t add up. According to Slideshare there have been 2,881 views of the slides from the post on A Challenge to Web Accessibility Metrics and Guidelines: Enhancing Access to Slides in which they had been embedded. However, as shown, there have only been 472 views of the blog post itself. Strange!

I subsequently realised that a Slideshare view will be recorded when the post is accessed, even if the individual slides are not viewed. And since a blog post continues to be shown on the blog’s home page (ukwebfocus.wordpress.com) until 30 subsequent posts have been published, each time someone visited the home page between 19 April (when the post was published) and 5 July 2012 (30 posts later) this would seemingly have registered as a view of the slides – even though most visitors will not have scrolled down and seen even the title slide!

What, then, do Slideshare usage statistics tell us? Clearly, if the slides have been embedded in a blog they don’t tell us how many people have viewed the slides – although if slides are not embedded elsewhere, or have been embedded in a static Web page, they may provide more indicative statistics. If the slides have been embedded in blog posts or other curated environments, the figures may give an indication of the popularity of the containing blog or similar environment. In Steve Wheeler’s case the popularity of his slides provides evidence of the popularity of his Learning with ‘e’s blog, the Damn Digital Chinese-language blog, the Building e-Capability blog and the Scoop.it and paper.li curation services – together with a spam farm.
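The scale of this possible inflation can be sketched with some back-of-the-envelope arithmetic, using the figures given above. This is only an upper bound under a simplifying assumption: it treats every recorded Slideshare view beyond the direct views of the blog post as a potential page-load artefact, which the actual Slideshare statistics do not let us verify.

```python
# Rough upper-bound estimate of how many recorded Slideshare "views"
# may be page-load artefacts rather than genuine slide views.
# The two input figures are from the text; the subtraction is a
# simplifying assumption, not something Slideshare reports.

recorded_slideshare_views = 2881   # views reported by Slideshare
blog_post_views = 472              # direct views of the embedding blog post

# Every home-page load while the post remained on the front page also
# registered as a "view", even if nobody scrolled down to the slides.
estimated_inflated_views = recorded_slideshare_views - blog_post_views

print(f"Views possibly inflated by home-page loads: {estimated_inflated_views}")
# → Views possibly inflated by home-page loads: 2409
```

In other words, under this assumption over 80% of the recorded views may tell us nothing about whether anyone actually looked at the slides.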

Lies, Damned Lies and Altmetrics?

Where does this leave services such as Impactstory? Looking at the Impactstory findings for my resources I can see that the slides for a paper on “Accessibility 2.0: People, Policies and Processes” seem to be the most highly-ranked, with 73 downloads and 2,989 views.

But how many of those views were views of the slides, rather than of the containing resources? And how many views may have been the result of visits from a spam farm?

I don’t have answers to these questions, or to the bigger question of “Will the value of altmetrics be undermined by the complex ways in which resources may be reused, misused or the systems gamed?”

This is a question I hope will be addressed at the Spot On London 2012 conference.



Posted in Events, Evidence | 10 Comments »