UK Web Focus (Brian Kelly)

Innovation and best practices for the Web


An SEO Analysis of UK University Web Sites

Posted by Brian Kelly on 8 Feb 2012

Why Benchmark SEO For UK University Web Sites?

The recent JISC Grant funding 18/11: OER rapid innovation describes (in the PDF document) how this call is based on a conceptualisation of “open educational resources as a component of a wider field of ‘open academic practice’, encompassing the many ways in which higher education is engaging and sharing with wider online culture”. The paper goes on to remind bidders that “Effective Search Engine Optimisation is key to open educational resources providing benefits of discoverability, reach, reputation and marketing”.

The JISC Call will be funding further developments of OER resources. But how easy will it be to find such resources, in light of the popularity of Google for finding resources? Or to put it another way, how Google-friendly are UK University Web sites? Are there examples of best practices which could be applied elsewhere in order to provide benefits across the UK higher education sector? And are there weaknesses which, if known about, could be addressed?

Recently an SEO Analysis of UK Russell Group University Home Pages Using Blekko was published on this blog, followed by an Analysis of Incoming Links to Russell Group University Home Pages which also made use of Blekko. These surveys were carried out across the 20 Russell Group universities, which describe themselves as the “20 leading UK universities which are committed to maintaining the very best research, an outstanding teaching and learning experience and unrivalled links with business and the public sector”.

Having evaluated use of the tool across this sample (and spotted possible problem areas where university web sites may have multiple domain name and entry point variants), the next step was to apply the tool across all UK university web sites.

The Survey

The survey began on 27 January 2012 using the Blekko search engine. A list of UK university web sites was created within Blekko, which automatically lists Blekko’s SEO rankings for those sites. This data was added to a Google spreadsheet, which was used to create the accompanying histogram.
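
Collecting and charting the rank values can be scripted. The following is a minimal sketch in Python, assuming the rank values have been exported from the Google spreadsheet as a two-column CSV file; the file name and column headings are illustrative assumptions, not the actual export used for this survey:

    # Sketch: plot a histogram of Blekko SEO rank values exported from
    # the Google spreadsheet. The CSV layout (institution,rank) and the
    # file name are assumptions for illustration.
    import csv

    import matplotlib.pyplot as plt

    ranks = []
    with open("uk-university-seo-ranks.csv", newline="") as f:
        for row in csv.DictReader(f):
            value = row["rank"].strip()
            if value and value != "-":  # skip institutions with no rank value
                ranks.append(float(value.replace(",", "")))

    plt.hist(ranks, bins=20)
    plt.xlabel("Blekko SEO rank")
    plt.ylabel("Number of institutions")
    plt.title("SEO ranks for UK university web sites (27 Jan 2012)")
    plt.savefig("seo-rank-histogram.png")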

It should be noted that the list of UK universities should not be regarded as definitive: some institutions may be included which should not be regarded as UK universities, and a small number of institutions may have been omitted from the analysis. The accompanying spreadsheet may be updated in light of feedback received.

Discussion

What can we learn from this data? The rankings for the top five institutions are given in Table 1. It might be useful to explore the reasons why these five web sites are so highly ranked. [Note the Russell Group universities were inadvertently omitted from the list when this post was published. Details have been added and Table 1 has been updated].

Table 1: Top Five SEO Rankings according to Blekko

Ref. No. | Institution             | Rank (on 27 Jan 2012) | Current Blekko Ranking
1        | UCL                     | 1,433.67              | View
2        | University of Liverpool | 1,286.85              | View
3        | University of Leeds     | 1,284.97              | View
4        | Durham University       | 1,277.32              | View
5        | University of York      | 1,246.03              | View

The embarrassing aspect of such comparisons lies in describing the web sites which are poorly ranked. Table 2 lists the SEO ranking figures for the lowest-ranked institutional web sites. In addition a screenshot of the table is included, taken at 6 pm on Monday 6 February 2012 (note that the date and time shown in the image is the date the entry was added to the table).

Table 2: Bottom Five SEO Rankings according to Blekko

Ref. No. | Institution                     | Rank (on 27 Jan 2012) | Current Blekko Ranking
1        | Trinity Saint David             | –                     | View
2        | Cardiff Metropolitan University | –                     | View
3        | UWL (University of West London) | –                     | View
4        | UCP Marjon                      | 28.91                 | View
5        | De Montfort University          | 31.58                 | View

It should be noted that two of the three web sites for which no rank value was available were in Wales. This may suggest that technical decisions taken to provide bilingual web sites might adversely affect search engine rankings. Since such factors only affect Welsh institutions, it would seem that these institutions will have a vested interest in identifying and implementing best practices for such web sites.
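
A simple first check of such barriers can be automated: does robots.txt block the home page, does the entry point redirect (for example to a language-selection splash page), and does the page carry a robots noindex directive? The sketch below, in Python, illustrates the idea; the domains are examples only and the checks are crude approximations, not the tests Blekko itself performs:

    import urllib.robotparser

    import requests

    def crawl_barriers(domain):
        """Return a list of obvious crawlability barriers for a domain."""
        barriers = []

        # Does robots.txt disallow the home page for all user agents?
        rp = urllib.robotparser.RobotFileParser()
        rp.set_url(f"http://{domain}/robots.txt")
        rp.read()
        if not rp.can_fetch("*", f"http://{domain}/"):
            barriers.append("robots.txt disallows the home page")

        # Does the entry point redirect, e.g. to a language-selection page?
        response = requests.get(f"http://{domain}/", timeout=10)
        if response.history:
            barriers.append(f"entry point redirects to {response.url}")

        # Crude check for a robots noindex directive in the page source.
        if 'name="robots"' in response.text and "noindex" in response.text.lower():
            barriers.append("page appears to carry a noindex directive")

        return barriers

    # Example domains (illustrative only)
    for domain in ["www.uwtsd.ac.uk", "www.cardiffmet.ac.uk"]:
        print(domain, crawl_barriers(domain) or "no obvious barriers")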

I must admit that I was surprised when I noticed a large institution such as De Montfort University listed in Table 2, with a Rank of 31.58. Viewing the detailed entry I found that a host rank value of 507.9 was given – very different from the rank of 31.58 which is listed in the table of all institutions.

Can We Trust the Findings?

Further investigation revealed additional discrepancies between the entries in the overall list of UK universities and the detailed entries. In the process of creating listings for use with the Blekko service, listings for the UK Russell Group universities (as well as the 1994 Group universities) were created.

Table 3 gives the Blekko Rank value together with the Host Rank value which is provided in the detailed entry for each web site. The accompanying screenshot provides further evidence of the findings, captured at 7 pm on 6 February 2012.

Table 3: Top Five SEO Rankings for Russell Group Universities

Ref. No. | Institution              | Rank (on 6 Feb 2012) | Host Rank (on 6 Feb 2012)
1        | UCL                      | 1,433.67             | 1,607.6
2        | University of Liverpool  | 1,286.80             | 1,260.3
3        | University of Leeds      | 1,284.97             | 1,141.8
4        | LSE                      | 1,224.59             | 1,201.1
5        | University of Nottingham | 1,138.99             | 1,382.9

Table 4 gives this information for the five Russell Group universities with the lowest SEO ranking values.

Table 4: Bottom Five SEO Rankings for Russell Group Universities

Ref. No. | Institution              | Rank (on 6 Feb 2012) | Host Rank (on 6 Feb 2012)
1        | University of Birmingham | 80.60                | 205.4
2        | University of Sheffield  | 395.04               | 529.7
3        | Imperial College         | 514.22               | 476.8
4        | University of Manchester | 610.86               | 694.2
5        | Cardiff University       | 692.08               | 752.9

From these two tables we can see that there are some disparities in the ranking order depending on which Rank value is used, but the numbers do not seem to be significantly different.
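
One way to test this impression is to compare the orderings produced by the two measures rather than the raw values, for example with a Spearman rank correlation. The following sketch uses the figures from Tables 3 and 4 above; the statistical test is my own addition and is not part of the Blekko service:

    # Compare the ordering implied by Blekko's "Rank" and "Host Rank"
    # figures using Spearman's rank correlation. The values are taken
    # from Tables 3 and 4 above.
    from scipy.stats import spearmanr

    rank      = [1433.67, 1286.80, 1284.97, 1224.59, 1138.99,
                 80.60, 395.04, 514.22, 610.86, 692.08]
    host_rank = [1607.6, 1260.3, 1141.8, 1201.1, 1382.9,
                 205.4, 529.7, 476.8, 694.2, 752.9]

    rho, p_value = spearmanr(rank, host_rank)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.4f})")
    # A rho close to 1 means the two measures order the web sites in
    # (almost) the same way, even where the absolute values differ.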

The Limitations of Closed Research

Initially I had envisaged that this post would help to identify examples of good and bad practices which could be shared across the sector since, as described in the JISC call cited above, “Effective Search Engine Optimisation is key to open educational resources providing benefits of discoverability, reach, reputation and marketing”. However it seems that gathering evidence of best practices is not necessarily easy, with the tools and techniques used for gathering evidence appearing to provide ambiguous or misleading findings.

This post illustrates the dangers of research which makes use of closed systems: we do not know the assumptions which the analytic tools are making, whether there are limitations in these assumptions or whether there are bugs in the implementation of the underlying algorithms.

These are reasons why open research approaches should be used, where possible. As described in Wikipedia, “Open research is research conducted in the spirit of free and open source software” which provides “clear accounts of the methodology freely available via the internet, along with any data or results extracted or derived from them”. The Blekko service initially appeared to support such open research practices since the web site states that “blekko doesn’t believe in keeping secrets”. However it subsequently became apparent that although Blekko may publish information about the SEO ranking for web sites, it does not describe how these rankings are determined.

It seems, as illustrated by a post which recently asked “How visible are UK universities in social media terms? A comparison of 20 Russell Group universities suggests that many large universities are just getting started“, that open research is not yet the norm in the analysis of web sites. The post describes:

Recent research by Horst Joepen from Searchmetrics [which] derives a ‘social media visibility’ score for 20 Russell Group universities, looking across their presence on Facebook, Twitter, Linked-in, Google+ and other media.

The eConsultancy blog describes how:

The visibility score we use here is based on the total number of links a web domain has scored on the six social sites, Facebook, Twitter, LinkedIn, Google+, Delicious and StumbleUpon, while accounting for different weightings we give to links on individual social sites.

Image from eConsultancy blog

But what are these different weightings? And how valid is it to simply take this score and divide it by the size of each institution (based on the number of staff and students) in order to produce the chart which, as illustrated, puts LSE as the clear leader?
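
The shape of the calculation described is straightforward to reproduce, even though the actual weightings are unpublished. The sketch below uses invented placeholder weights and link counts, purely to show how sensitive the resulting ‘per head’ league table is to those choices:

    # Sketch of the kind of calculation the eConsultancy post describes:
    # a weighted sum of link counts across six social sites, divided by
    # institution size. All weights and counts are invented placeholders;
    # Searchmetrics has not published its actual weightings.
    weights = {"facebook": 3, "twitter": 2, "linkedin": 1,
               "googleplus": 1, "delicious": 1, "stumbleupon": 1}

    institutions = {
        # name: (hypothetical link counts per site, staff + students)
        "LSE": ({"facebook": 900, "twitter": 1200, "linkedin": 300,
                 "googleplus": 100, "delicious": 50, "stumbleupon": 40},
                12000),
        "UCL": ({"facebook": 1500, "twitter": 1800, "linkedin": 500,
                 "googleplus": 150, "delicious": 70, "stumbleupon": 60},
                35000),
    }

    for name, (links, size) in institutions.items():
        visibility = sum(weights[s] * n for s, n in links.items())
        print(f"{name}: visibility {visibility}, per head {visibility / size:.3f}")
    # Changing the weights (or the size measure) can reorder the league
    # table, which is why the unpublished weightings matter.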

It should be noted that this work is based on the analysis of:

roughly 207,900 links every week related to content on the websites of the Russell Group universities posted on Twitter, Facebook (likes, comments and shares), Linkedin, Google+ and social bookmarking sites StumbleUpon and Delicious. 

and is therefore not directly related to the SEO analysis addressed in this blog. This work is referenced in order to reiterate the dangers of closed research.

However the LSE Impact of Social Sciences blog, which hosted the post about this study, made the point that:

The LSE Impacts blog approach is that some data (no doubt with limitations) are better than none at all. 

I would agree with this view – it can be useful to gather, analyse and visualise such data and to provide stories which interpret such findings. The Blekko analysis, for example, seems to suggest that UCL and the universities of Liverpool, Leeds, Durham and York have implemented strategies which make their web sites highly visible to search engines, while Trinity Saint David, Cardiff Metropolitan University and the University of West London seem to have made technical decisions which may act as barriers to search engines. The eConsultancy analysis, meanwhile, suggests that LSE’s approach to the use of social media services is particularly successful. But are such interpretations valid?

Unanswered Questions

Questions which need to be answered are:

  • How valid are the assumptions which are made which underpin the analysis?
  • How robust are the data collection and analysis services?
  • Are the findings corroborated by related surveys? (such as the survey of Facebook ‘likes’ for Russell Group universities described in a post which asked Is It Time To Ditch Facebook, When There’s Half a Million Fans Across Russell Group Universities?)
  • What relevance do the findings have to the related business purposes of the institutions?
  • What actions should institutions be taking in light of the findings and the answers to the first three questions?

What do you think?


Paradata: As described in a post on Paradata for Online Surveys, blog posts which contain live links to data will include a summary of the survey environment in order to help ensure that survey findings are reproducible, with potentially misleading information highlighted.

This survey was initially carried out over a period of a few weeks in January and February 2012 using Chrome on a Windows 7 PC and Safari on an Apple Macintosh. The survey was carried out using the Blekko web-based tool. A request was made to Blekko for more detailed information about their ranking scores and harvesting strategies, but the reply simply repeated the limited information available on the Blekko web site. Inconsistencies in the findings were noted and reported to Blekko by email (and also via an online support form on their web site), but no response has been received.

The information on the survey of visibility of Russell Group universities on social media sites was based on posts published on the LSE Impact of Social Sciences and eConsultancy blogs.

Footnote: The findings for the Russell Group universities were omitted from the list of all UK universities when this post was initially published. The data has now been added and Table 1 and the associated histogram have been updated.


Twitter conversation from Topsy: [View]

10 Responses to “An SEO Analysis of UK University Web Sites”

  1. Hmmm…not sure I follow the validity of this at all Brian. You say that “a small number of institutions which may have been omitted from the analysis” in the main UK university list – in fact it excludes the three highest ranking of all which appear in the Russell Group list. The 1994 list kind of matches the all institutions list (at top anyway) but excludes Aberdeen altogether. If the numbers are to be believed the “top five” should actually be UCL (;-0), Liverpool, Leeds, Durham and York. As you say, difficult to know what’s really going on within these closed systems.

  2. Pat said

    It is a bit of a leap to go from SEO to finding – one is cataloguing, and the other is a process to ensure you’re ranked highly in Google.

    Would you prefer your learning materials to only be those which someone has artificially optimised so as to appear highly in Google? Does that sound trustworthy? It doesn’t exactly feel scholarly.

    It also assumes that the sole providers of OER are universities. If you do any Google search, Google’s algorithm is modified to prefer Wikipedia (whose every page is an OER) so they feature quite highly.

    • Hi Pat
      The importance of SEO was mentioned in the recent JISC OER call, which I cited in the post.

      I would prefer that OERs are not difficult to find using Google; otherwise we fail to exploit the investment made in the development of OERs.

      Clearly there will be many providers of OERs. I suspect that many commercial providers of OERs will have SEO strategies in place for their resources – we need to ensure that our web sites aren’t artificially optimised to appear low in Google rankings!

      • Pat said

        SEO for a web site / brand (a university web site) is different from SEO for a learning resource though – one is clearly competitive, the other is more about good practice. It’s a bit apples and oranges. Only one of the top five universities (UCL) has any OER – whereas two of the bottom five do.

        It might be more apt to look at how most repository software is very poor for SEO – failure to provide well-structured URLs, poor embedding of RDFa – this is probably a bigger issue. It is hard to have any SEO if your system can’t provide it.

        Fundamentally, you can’t find an OER via Google, as it has no method for recording whether a resource has an open licence.

    • There will be lots of different reasons why people search for content held on university web sites: to find educational resources is just one of many.

      You are correct that repository software may be poor for SEO – and this may be the case for other types of software too, such as CMSs.

      Note that users may not necessarily be looking for an OER – they may be looking for an educational resource and not care about its licence. An OER could be more findable than other types of educational resources, as more sites may link to resources with CC licences – but there’s a need to understand other barriers to SEO.

  3. In response to interesting comments on this post and elsewhere, a Polldaddy survey has been set up which enables anonymous feedback to be given on the value of SEO surveys, together with personal views and details of institutional activities in this area.

  4. This is great post..this seo analysis is very aweome…i think this is very useful for seo peoples…

  5. […] An SEO Analysis of UK University Web Sites […]

Leave a comment