
An SEO Analysis of UK University Web Sites

Posted by Brian Kelly (UK Web Focus) on 8 February 2012

Why Benchmark SEO For UK University Web Sites?

The recent JISC Grant Funding 18/11: OER Rapid Innovation call describes (in the PDF document) how the call is based on a conceptualisation of “open educational resources as a component of a wider field of ‘open academic practice’, encompassing the many ways in which higher education is engaging and sharing with wider online culture”. The document goes on to remind bidders that “Effective Search Engine Optimisation is key to open educational resources providing benefits of discoverability, reach, reputation and marketing”.

The JISC Call will be funding further developments of OER resources. But how easy will it be to find such resources, given the popularity of Google as the starting point for discovery? Or to put it another way, how Google-friendly are UK university web sites? Are there examples of best practices which could be applied elsewhere in order to provide benefits across the UK higher education sector? And are there weaknesses which, if known about, could be addressed?

Recently an SEO Analysis of UK Russell Group University Home Pages Using Blekko was published on this blog, followed by an Analysis of Incoming Links to Russell Group University Home Pages which also made use of Blekko. These surveys covered the 20 Russell Group universities, which describe themselves as the “20 leading UK universities which are committed to maintaining the very best research, an outstanding teaching and learning experience and unrivalled links with business and the public sector”.

Having evaluated the tool across this sample, and spotted possible problem areas where university web sites may have multiple domain names and entry point variants, I went on to apply the tool across all UK university web sites.

The Survey

The survey began on 27 January 2012 using the Blekko search engine. A list of UK university web sites was created within Blekko, which automatically displays Blekko’s SEO ranking for each site. This data was then copied into a Google spreadsheet, which was used to create the accompanying histogram.

It should be noted that the list of UK universities should not be regarded as definitive. There may be institutions included which should not be regarded as UK universities, and a small number of institutions may have been omitted from the analysis. The accompanying spreadsheet may be updated in light of feedback received.
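For those wishing to reproduce the charting step, the histogram can be recreated from a CSV export of the spreadsheet. The sketch below is illustrative only: the file name and the “Rank” column heading are assumptions of mine, not the actual spreadsheet layout.

```python
# A minimal sketch of the charting step, assuming the Blekko rank
# values have been exported from the Google spreadsheet as a CSV file.
# The file name and "Rank" column heading are assumptions, not the
# actual spreadsheet layout.
import csv

import matplotlib.pyplot as plt


def load_ranks(path):
    """Read numeric rank values, skipping sites with no rank ('-')."""
    ranks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            value = row["Rank"].replace(",", "").strip()
            if value and value != "-":
                ranks.append(float(value))
    return ranks


ranks = load_ranks("uk-university-blekko-ranks.csv")  # hypothetical export
plt.hist(ranks, bins=20)
plt.xlabel("Blekko SEO rank")
plt.ylabel("Number of institutions")
plt.title("Blekko SEO ranks for UK university web sites")
plt.savefig("blekko-rank-histogram.png")
```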

Discussion

What can we learn from this data? The rankings for the top five institutions are given in Table 1. It might be useful to explore why these five web sites are so highly ranked. [Note: the Russell Group universities were inadvertently omitted from the list when this post was first published. Details have been added and Table 1 has been updated.]

Table 1: Top Five SEO Rankings according to Blekko

Ref. No. | Institution             | Rank (on 27 Jan 2012) | Current Blekko Ranking
1        | UCL                     | 1,433.67              | [View]
2        | University of Liverpool | 1,286.85              | [View]
3        | University of Leeds     | 1,284.97              | [View]
4        | Durham University       | 1,277.32              | [View]
5        | University of York      | 1,246.03              | [View]

The embarrassing aspect of such comparisons lies in describing the web sites which are poorly ranked. Table 2 lists the SEO ranking figures for the lowest-ranked institutional web sites. A screenshot of the table is also included, taken at 6 pm on Monday 6 February 2012 (note that the date and time shown in the image is the date the entry was added to the table).

Table 2: Bottom Five SEO Rankings according to Blekko

Ref. No. | Institution                     | Rank (on 27 Jan 2012) | Current Blekko Ranking
1        | Trinity Saint David             | -                     | [View]
2        | Cardiff Metropolitan University | -                     | [View]
3        | UWL (University of West London) | -                     | [View]
4        | UCP Marjon                      | 28.91                 | [View]
5        | De Montfort University          | 31.58                 | [View]

It should be noted that two of the three web sites for which no rank value was available are in Wales. This may suggest that technical decisions taken to provide bilingual web sites might adversely affect search engine rankings. Since such factors only affect Welsh institutions, it would seem that these institutions have a vested interest in identifying and implementing best practices for such web sites.
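One way to begin investigating this would be to inspect the language signals a bilingual site exposes to crawlers, such as the Content-Language response header and rel="alternate" hreflang links, which search engines use to distinguish language variants. The following Python sketch is my own illustration, not part of the Blekko survey, and the URL in the example is a placeholder:

```python
# A minimal sketch (not part of the original survey) for checking the
# language signals a bilingual site exposes to crawlers: the
# Content-Language response header and any rel="alternate" hreflang
# links in the page head.
from html.parser import HTMLParser
from urllib.request import urlopen


class HreflangParser(HTMLParser):
    """Collect <link rel="alternate" hreflang="..."> entries."""

    def __init__(self):
        super().__init__()
        self.alternates = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates.append((a["hreflang"], a.get("href")))


def language_signals(url):
    """Return the Content-Language header and hreflang alternates."""
    with urlopen(url) as response:
        header = response.headers.get("Content-Language")
        parser = HreflangParser()
        parser.feed(response.read().decode("utf-8", errors="replace"))
    return header, parser.alternates


if __name__ == "__main__":
    # Placeholder URL; any bilingual institutional home page would do.
    header, alternates = language_signals("https://www.cardiffmet.ac.uk/")
    print("Content-Language:", header)
    for lang, href in alternates:
        print(lang, "->", href)
```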

I must admit I was surprised to see a large institution such as De Montfort University listed in Table 2, with a Rank of 31.58. Viewing the detailed entry, I found a Host Rank value of 507.9 – very different from the 31.58 listed in the table of all institutions.

Can We Trust the Findings?

Further investigation revealed additional discrepancies between the entries in the overall list of UK universities and the detailed entries. In the process of creating lists for use with the Blekko service, listings for the UK Russell Group universities (as well as the 1994 Group universities) had also been created.

Table 3 gives the Blekko Rank value together with the Host Rank value provided in the detailed entry for each web site. The accompanying screenshot provides further evidence of the findings, captured at 7 pm on 6 February 2012.

Table 3: Top Five SEO Rankings for Russell Group Universities

Ref. No. | Institution              | Rank (on 6 Feb 2012) | Host Rank (on 6 Feb 2012)
1        | UCL                      | 1,433.67             | 1,607.6
2        | University of Liverpool  | 1,286.80             | 1,260.3
3        | University of Leeds      | 1,284.97             | 1,141.8
4        | LSE                      | 1,224.59             | 1,201.1
5        | University of Nottingham | 1,138.99             | 1,382.9

Table 4 gives this information for the five Russell Group universities with the lowest SEO ranking values.

Table 4: Bottom Five SEO Rankings for Russell Group Universities

Ref. No. | Institution              | Rank (on 6 Feb 2012) | Host Rank (on 6 Feb 2012)
1        | University of Birmingham | 80.60                | 205.4
2        | University of Sheffield  | 395.04               | 529.7
3        | Imperial College         | 514.22               | 476.8
4        | University of Manchester | 610.86               | 694.2
5        | Cardiff University       | 692.08               | 752.9

From these two tables we can see that there are some disparities in the ranking order depending on which value is used, but the numbers do not seem to be significantly different.
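As a rough check of this impression, the agreement between the two scores can be quantified from the figures in Tables 3 and 4. The sketch below (my own, not part of the Blekko service) computes Spearman’s rank correlation for the ten institutions; with the values above it comes out at roughly 0.9, i.e. the two scores broadly agree on ordering even where the absolute numbers differ.

```python
# Spearman's rank correlation between the list "Rank" and the
# detailed-entry "Host Rank" for the ten universities in Tables 3
# and 4. Values are copied from the tables above.
rank = [1433.67, 1286.80, 1284.97, 1224.59, 1138.99,
        80.60, 395.04, 514.22, 610.86, 692.08]
host_rank = [1607.6, 1260.3, 1141.8, 1201.1, 1382.9,
             205.4, 529.7, 476.8, 694.2, 752.9]


def spearman(xs, ys):
    """Spearman's rho for two equal-length lists without ties."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for position, i in enumerate(order):
            r[i] = position + 1
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))


print(round(spearman(rank, host_rank), 3))  # prints 0.903
```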

The Limitations of Closed Research

Initially I had envisaged that this post would help to identify examples of good and bad practices which could be shared across the sector since, as the JISC call quoted above puts it, “Effective Search Engine Optimisation is key to open educational resources providing benefits of discoverability, reach, reputation and marketing”. However it seems that gathering evidence of best practices is not necessarily easy, with the tools and techniques used for gathering evidence appearing to provide ambiguous or misleading findings.

This post illustrates the dangers of research which makes use of closed systems: we do not know what assumptions the analytic tools are making, whether there are limitations in those assumptions, or whether there are bugs in the implementation of the underlying algorithms.

These are reasons why open research approaches should be used where possible. As described in Wikipedia, “Open research is research conducted in the spirit of free and open source software” which provides “clear accounts of the methodology freely available via the internet, along with any data or results extracted or derived from them”. The Blekko service initially appeared to support such open research practices, since the web site states that “blekko doesn’t believe in keeping secrets”. However it subsequently became apparent that although Blekko may publish information about the SEO ranking for web sites, it does not describe how these rankings are determined.

It seems, as illustrated by a post which recently asked “How visible are UK universities in social media terms? A comparison of 20 Russell Group universities suggests that many large universities are just getting started“, that open research is not yet the norm in the analysis of web sites. The post describes:

Recent research by Horst Joepen from Searchmetrics [which] derives a ‘social media visibility’ score for 20 Russell Group universities, looking across their presence on Facebook, Twitter, Linked-in, Google+ and other media.

The eConsultancy blog describes how:

The visibility score we use here is based on the total number of links a web domain has scored on the six social sites, Facebook, Twitter, LinkedIn, Google+, Delicious and StumbleUpon, while accounting for different weightings we give to links on individual social sites.

Image from eConsultancy blog

But what are these different weightings? And how valid is it to simply take this score and divide it by the size of the institution (based on the number of staff and students) in order to produce the chart which, as illustrated above, puts LSE as the clear leader?
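To make the calculation concrete, here is a Python sketch of the kind of weighted scoring the eConsultancy quote describes. The weights and the example counts are placeholders of my own: Searchmetrics has not published its actual weightings, which is precisely the problem.

```python
# A sketch of the kind of calculation the eConsultancy quote describes.
# The real Searchmetrics weightings are not published, so the weights
# below are placeholders chosen purely for illustration.
WEIGHTS = {
    "facebook": 1.0,
    "twitter": 1.0,
    "linkedin": 0.5,
    "googleplus": 0.5,
    "delicious": 0.25,
    "stumbleupon": 0.25,
}


def visibility(link_counts):
    """Weighted sum of social links pointing at a web domain."""
    return sum(WEIGHTS[site] * n for site, n in link_counts.items())


def visibility_per_head(link_counts, staff_and_students):
    """The size-normalised variant the post questions."""
    return visibility(link_counts) / staff_and_students


# Hypothetical counts, for illustration only.
example = {"facebook": 12000, "twitter": 8000, "linkedin": 1500,
           "googleplus": 400, "delicious": 120, "stumbleupon": 80}
print(visibility(example), visibility_per_head(example, 14000))
```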

It should be noted that this work is based on the analysis of:

roughly 207,900 links every week related to content on the websites of the Russell Group universities posted on Twitter, Facebook (likes, comments and shares), Linkedin, Google+ and social bookmarking sites StumbleUpon and Delicious. 

and is therefore not directly comparable to the SEO analysis addressed in this post. It is referenced here to reiterate the dangers of closed research.

However the LSE Impact of Social Sciences blog, which hosted the post about this study, made the point that:

The LSE Impacts blog approach is that some data (no doubt with limitations) are better than none at all. 

I would agree with this view – it can be useful to gather, analyse and visualise such data in order to tell stories which interpret the findings. The Blekko analysis, for example, seems to suggest that UCL and the universities of Liverpool, Leeds, Durham and York have implemented strategies which make their web sites highly visible to search engines, whereas Trinity Saint David, Cardiff Metropolitan University and the University of West London seem to have taken technical decisions which may act as barriers to search engines. The eConsultancy analysis, meanwhile, suggests that LSE’s approach to the use of social media services is particularly successful. But are such interpretations valid?

Unanswered Questions

Questions which need to be answered are:

  • How valid are the assumptions which underpin the analysis?
  • How robust are the data collection and analysis services?
  • Are the findings corroborated by related surveys? (such as the survey of Facebook ‘likes’ for Russell Group universities described in a post which asked Is It Time To Ditch Facebook, When There’s Half a Million Fans Across Russell Group Universities?)
  • What relevance do the findings have to the related business purposes of the institutions?
  • What actions should institutions be taking in light of the findings and the answers to the first three questions?

What do you think?


Paradata: As described in the post Paradata for Online Surveys, blog posts which contain live links to data will include a summary of the survey environment in order to help ensure that the findings are reproducible, with potentially misleading information highlighted.

This survey was carried out over a period of a few weeks in January and February 2012 using Chrome on a Windows 7 PC and Safari on an Apple Macintosh, using the Blekko web-based tool. A request was made to Blekko for more detailed information about their ranking scores and harvesting strategies, but the reply simply restated the limited information available on the Blekko web site. The inconsistencies in the findings were also reported to Blekko by support email (and via an online support form on the web site), but no response has been received.

The information on the survey of visibility of Russell Group universities on social media sites was based on posts published on the LSE Impact of Social Sciences and eConsultancy blogs.

Footnote: The findings for the Russell Group universities were omitted from the list of all UK universities when this post was initially published. The data has now been added and Table 1 and the associated histogram have been updated.



Posted in Evidence, search