UK Web Focus

Innovation and best practices for the Web

Bath is the University of the Year! But What if Online Metrics Were Included?

Posted by Brian Kelly (UK Web Focus) on 14 September 2011

University of the Year

For the first time in a long, long time, last weekend I bought the Sunday Times. The reason was to read the newspaper's announcement that the University of Bath had been named University of the Year.

As someone who has worked and lived in Bath for almost 15 years I was very pleased with the news – but not as pleased, I suspect, as the Vice-Chancellor and members of the University's Press Office, which, of course, published a University news item with details of the announcement, informing us that:

The University of Bath has been awarded the title of ‘University of the Year 2011/12’ by The Sunday Times, one of the most prominent and influential newspapers in the world.

The news item went on to highlight another metric:

In that league table the University of Bath has risen to 5th out of 122 UK universities and colleges – its highest ever position.

Last Friday I viewed a video clip in which the Vice-Chancellor announced the news, and as I left campus on Friday evening I noticed the posters scattered around the University Parade, informing potential students (and their parents) who would be visiting the campus the following day for the University Open Day what a great university they would be visiting.

Being identified as the top University by (ahem) “one of the most prominent and influential newspapers in the world” is clearly deemed important by the powers that be at the University. In addition, several people I follow on Twitter who don’t work in marketing positions also tweeted the news.

What If Online Metrics Also Counted?

Yesterday Sheila MacNeill, Assistant Director at JISC CETIS, alerted me to a Mashable article which asked “How Digitally Connected Are the U.S. News Top 20 Colleges?”. The article referred to a U.S. News list of top-ranking national universities and national liberal arts colleges which appears similar to the Sunday Times survey. The Mashable article described how they:

decided to add another factor for review: social media connectedness. Below you’ll find top 10 lists of universities and liberal arts colleges alongside an analysis of their social media presences

This puts Harvard in equal first place, with 66,737 Twitter followers, 698,933 Facebook likes, 390 YouTube videos and 27,786 YouTube subscribers. Harvard tied with Princeton University, which had 15,572 Twitter followers, 52,125 Facebook likes, 164 YouTube videos and 2,978 subscribers. The positions in this league table seem to be based on an undocumented weighting of the social media metrics.
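Since the weighting is undocumented, we can only guess at the arithmetic. One plausible sketch is to min-max normalise each metric across the institutions and sum the normalised values under some chosen weights – the weights and the normalisation below are entirely my own assumptions, not Mashable's method:

```python
# Hypothetical scoring of social media metrics.
# The weights and the min-max normalisation are assumptions:
# the Mashable article does not document its actual method.

METRICS = ["twitter_followers", "facebook_likes",
           "youtube_videos", "youtube_subscribers"]
WEIGHTS = {"twitter_followers": 0.4, "facebook_likes": 0.3,
           "youtube_videos": 0.1, "youtube_subscribers": 0.2}

def social_media_scores(institutions):
    """Return {name: score}, each metric min-max normalised to 0..1
    across all institutions, then combined using WEIGHTS."""
    scores = {}
    for metric in METRICS:
        values = [inst[metric] for inst in institutions.values()]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1  # avoid division by zero if all equal
        for name, inst in institutions.items():
            normalised = (inst[metric] - lo) / span
            scores[name] = scores.get(name, 0.0) + WEIGHTS[metric] * normalised
    return scores

# Figures quoted in the Mashable article:
data = {
    "Harvard":   {"twitter_followers": 66737, "facebook_likes": 698933,
                  "youtube_videos": 390, "youtube_subscribers": 27786},
    "Princeton": {"twitter_followers": 15572, "facebook_likes": 52125,
                  "youtube_videos": 164, "youtube_subscribers": 2978},
}
print(social_media_scores(data))
```

With only two institutions the normalisation is degenerate (the larger value always scores 1, the smaller 0), which itself illustrates how sensitive such league tables are to the choice of method.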

Lies, Damned Lies and Social Media (and Other) Analytics

It is easy to dismiss the Mashable article as trivia, statistically flawed or dangerous, depending on your particular take. But can’t the same criticisms be made of the Sunday Times league tables?

Since the Sunday Times article is hidden behind a paywall (and I’d left my copy at home) I subscribed to the Times / Sunday Times service in order to read about the methodology they had employed (note to self: cancel the Direct Debit before the full payment is due!).

The methodology (which is summarised here) states:

Universities were ranked according to marks scored in nine key performance areas.

Teaching excellence (250 points): The results of questions 1 to 12 of the 2011 national student survey (NSS) are scored taking a theoretical minimum and maximum score of 50% and 90% respectively. …

Student satisfaction (+50 to -55 points): The responses given to Question 22 of the National Student Survey: “Overall, I am satisfied with the quality of the course” were compared to a benchmark for the given institution, devised according to a formula based on the social and subject mix. …

Peer assessment (100 points): Academics across all institutions included in our guide were asked to rate departments in their subject field on a five-point scale for the quality of their undergraduate provision and a figure was awarded to each institution based on coverting (sic) the average score for each institution on to a 100-point scale. …

Research quality (200 points): We used data from the most recent research assessment exercise, published in December 2008. Five different ratings were awarded for research quality, ranging from 4* to unclassified, from which we calculated an average score per member of staff entered for assessment. This average score was converted to a percentage and double weighted to give a score out of 200.  …

A-level/Higher points (250 points): Nationally audited data for the 2009-10 academic year were used for league table calculations. All entry points gained under the Ucas tariff system were used to calculate mean scores for all universities. Grades for leading qualifications were awarded points according to the following scale: A-levels – A*: 140; A:120, B:100, C:80, D:60 and E:40; AS-levels – A:60, B:50, C:40, D:30, E:20; Advanced Highers – A:120, B:100, C:80; Highers – A:72, B:60, C:48.  …

Unemployment (200 points): The number of students assumed to be unemployed six months after graduation was calculated as a percentage of the total number of known destinations. This is shown as a percentage in each profile. For the league table calculation, the percentage was subtracted from 50. …

Firsts/2:1s awarded (100 points): We calculated the percentage of students who graduated with firsts or 2:1 degrees.  …

Dropout rate (+57 to -74 points): The number of students who drop out before completing their courses was compared with the number expected to do so (the benchmark figure shown in brackets in the university profiles). Benchmarks vary according to subject mix and students’ entry qualifications. The percentage difference between the projected dropout rate and the benchmark was multiplied by five and awarded as a bonus/penalty mark. Universities that lost fewer students than their benchmark gained, those losing more had points deducted. …
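Two of these rules are explicit enough to apply literally: the unemployment score subtracts a percentage from 50, and the dropout score multiplies a benchmark difference by five. A minimal sketch, assuming the published formulas are applied exactly as stated (the example figures are invented for illustration):

```python
def unemployment_points(pct_unemployed):
    """Sunday Times rule: the percentage of graduates assumed unemployed
    six months after graduation is subtracted from 50.
    (Sketch: assumes the result is used directly as the points score.)"""
    return 50 - pct_unemployed

def dropout_points(actual_pct, benchmark_pct):
    """Sunday Times rule: the percentage difference between the projected
    dropout rate and the benchmark, multiplied by five. Universities that
    lose fewer students than their benchmark gain points; those losing
    more have points deducted."""
    return (benchmark_pct - actual_pct) * 5

# A hypothetical university with 4% graduate unemployment and a 6%
# dropout rate against an 8% benchmark:
print(unemployment_points(4))   # 46
print(dropout_points(6, 8))     # +10 bonus
print(dropout_points(10, 8))    # -10 penalty
```

Even in this simple form, the arbitrariness is visible: the multiplier of five and the subtraction from 50 are editorial choices, not measurements.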

Hmm. Are the ways in which the individual scores are compiled, and the scores for the nine categories then aggregated, significantly different from the way in which social media analytics companies such as Klout and PeerIndex determine their scores (which I summarised in a post on Social Analytics for Russell Group University Twitter Accounts)?

Doesn’t it seem likely that we will see the Sunday Times survey of UK universities in future years include analyses of universities’ online presence?

And won’t this be treated as important by those involved in University marketing and student recruitment, despite the limitations such methodologies may have?

Posted in Evidence | 9 Comments »