UK Web Focus (Brian Kelly)

Innovation and best practices for the Web

Archive for March, 2008

Disappearing Public Sector Web Sites

Posted by Brian Kelly on 31 March 2008

I recently used the Intute service to see what records it held about UKOLN’s activities. I found a record about the ‘Crossroads West Midlands’ service, for which UKOLN provided technical advice on the design of the collection description database:

This is the website of ‘Crossroads West Midlands’, a Resource funded project that is working to develop online access to the collections of libraries, museums and archives in the West Midlands (including universities and local authorities as well as private institutions). The Crossroads website is currently a prototype, testing a database built upon the RSLP collection level description database, covering the collections relating to the potteries industry of North Staffordshire.

The record provides additional information about the service which reminded me about the meetings I attended several years ago about this project. I was interested to see what the Crossroads West Midlands service now looks like, so I followed the link to the address – and, rather than a service providing access to a database of cultural heritage resources in the West Midlands, I found a page full of links to services such as golf, gambling, estate agents, motor insurance, etc.

Crossroads West Midlands Web Site

Clearly at some point the domain name for the original service had lapsed and had been purchased by a company which used it to host advertisements and links to companies willing to advertise in this way (or possibly companies wishing to enhance their search engine ranking had procured the services of a Search Engine Optimisation company and might not be aware of the approach taken).

I was interested in the history of the Web site. Using the Internet Archive I discovered that the Web site was first archived on 26 September 2002. At that point the archived pages contained details about the project. The service itself was first launched around February 2003. And the service disappeared, to be replaced by an advertisement site, at some point between December 2005 and April 2006.
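This kind of detective work can be scripted. Here is a minimal sketch, using the Internet Archive’s Wayback Machine CDX API, of how one might ask when a site was first captured. The domain and the sample response line are illustrative assumptions, not real captured data.

```python
# Sketch: find the earliest Wayback Machine capture of a domain via the
# CDX API (https://web.archive.org/cdx/search/cdx). No network call is
# made here; a hypothetical response line is parsed instead.

def cdx_query_url(domain: str) -> str:
    """Build a CDX query returning only the earliest capture of a domain."""
    return (
        "https://web.archive.org/cdx/search/cdx"
        f"?url={domain}&output=text&limit=1&fl=timestamp,original"
    )

def parse_capture(line: str) -> dict:
    """Parse one space-separated 'timestamp original' CDX result line."""
    timestamp, original = line.split(" ", 1)
    return {
        "date": f"{timestamp[0:4]}-{timestamp[4:6]}-{timestamp[6:8]}",
        "url": original,
    }

# A hypothetical CDX response line for an example domain:
sample = "20020926120000 http://www.example.org/"
print(parse_capture(sample)["date"])  # → 2002-09-26
```

Fetching `cdx_query_url("example.org")` with any HTTP client would return lines in the same format for a real domain.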

What happened? Did project funding run out? Did key staff leave? Or was there a blunder, with nobody receiving the email requesting renewal of the domain name?

Whatever the reason, this West Midlands Crossroads service has disappeared from sight. Is this inevitable? Well, back in 1999 I was the project manager for the Exploit Interactive e-journal – an EU-funded project which ran until 2000. Once the funding had finished we had to decide what would happen to the domain name. We agreed to continue paying for the domain for at least 3 years after the project funding had ceased, and to try to keep the domain for a period of 10 years. This policy was informed by a survey I carried out of project Web sites funded by the EU-funded Telematics for Libraries programme. As I described in an article published in Exploit Interactive in October 2000, 23 of the 103 funded project Web sites had disappeared.

We are seeing a disappearance of cultural resources and EU-funded projects from the digital environment. And this may well get worse if the UK Government’s policy of centralising its Web sites, which will result in 551 Web sites being closed down, is not managed properly. Will we, for example, find that the Drugdrive Web site suddenly becomes a site used for selling drugs?

What is to be done? The good news is that the Government does seem to be handling its redirects properly – the Drugdrive Web site, for example, is redirected to its new location.

Well done, the UK Government. But what about the rest of us? Are we managing the closure of Web sites? And are we assessing the risks of failing to do this? After all, if the domain of a government Web site on protecting children from dangers on the Internet lapsed and was bought by a pornography site, we could well see a government minister being forced to resign.

Posted in preservation | 3 Comments »

Come Into My World

Posted by Brian Kelly on 28 March 2008

Back in December 2007 Lorcan Dempsey wrote a blog post about the Nexus Facebook application, which provides a visualisation of your friends in Facebook. The highest density of his friends was among his professional colleagues, followed by “mostly UK friends (and the most highly connected nodes are people who work or worked at UKOLN)“.

This seemed interesting, so I installed the Nexus application and captured a screenshot of the representation of my friends and contacts. As with Lorcan, the highest density represents professional colleagues across the UK Web management community. The second-largest cluster, shown on the bottom right of the image, is my rapper sword dancing and folkie friends.

The Nexus Facebook Application

It’s possible to interact with the data, exploring who knows whom and what the links are.

The concluding remark Lorcan made on his blog post was “Not sure it means much, but it was interesting to play with for a while ….“.

I agree with Lorcan that it’s fun to play with. But can it be used in any meaningful fashion? I’m inclined to think that it may have some potential in the support of information literacy.

Could this tool be used by students to explore the relationships across their groups of friends? Perhaps one could suggest that the students write a Daily Mail-style exposé based on the premise that “It’s 2028 and Carl Marks is the new leader of the Labour Party. Our Social Networking History Correspondent has managed to unearth the shocking details of what Carl got up to as a student. Read pages 1-5 for the shocking truth“. Or, in the interests of balance, write an article for the New Marxism Today on “On the day Prince William ascends to the throne we describe his student lifestyle“.

NOTE (added on 1 November 2012): This service was shut down on 7 October 2009.

Posted in Facebook | 4 Comments »

It’s Not New Labour vs Old Labour, It’s Cato And Cicero (typos fixed)

Posted by Brian Kelly on 25 March 2008

I’ve previously suggested that there’s a need for political realism in the debates over ownership of social networks and the general direction of Web 2.0. And I’ve suggested that Old Labour is dead and that any expectation that the government will start nationalising services is naive.

Well, I got that wrong didn’t I! However lefties in the US and Canada will probably be disappointed that the Government’s nationalisation of Northern Rock doesn’t herald a return to socialist principles – indeed even the Daily Mail acknowledges that nationalisation “is extremely rare and embarrassing for Labour“.

I think my mistake was in attempting to use political analogies which are still too relevant to many and capable of being reinterpreted in different ways.

So I was really pleased to read Martin Weller’s post on Downes vs Wiley – Cato and Cicero revisited on his Edtechie blog. As Martin describes:

Cato and Cicero both believed passionately in the same higher level goal, ie the establishment of the Roman Republic. Yet they frequently clashed about what was the best way to achieve it. In the same way I think Stephen (Downes) and David (Wiley) both believe passionately in the overall aim of open education, but have differing views as to how it should be realised.

Cato was the purist, unbending and uncompromising. Cicero was the pragmatist, willing to compromise and work with a range of people to advance the republic. Cato often thought Cicero compromised too much, thus rendering his beliefs invalid. Cicero was often infuriated that Cato wouldn’t compromise and through this played into the hands of the anti-republicans.

In his post Martin was suggesting that Stephen Downes’ objections to the Cape Town Declaration were based on the declaration’s inclusion of commercial entities, with Stephen arguing that “… the internet is already awash with really vile and intrusive commercial activity, do we have to export it too? We have the opportunity to do something really special in the world; why do we have to carve into every declaration of principle a paean to Things As They Are (and Those Who Profit From Them)?“.

Now I have to admit that, although my knowledge of Cicero and Cato is limited to having read Imperium, I have (mostly) taken a pragmatic approach to life generally and IT development in particular.

This struck me today when I read an article in CILIP Update about the inclusion of advertising leaflets in books borrowed from libraries, and then returned home to find that my new passport had arrived – and a leaflet from a local estate agent was included in the letter (together with one from the NHS inviting me to join the NHS Organ Donor Register).

Now I personally don’t have any great concerns about the inclusion of adverts in library books or with my passport. Indeed, if the income this generates can improve the quality of these services, then I would suggest that this is a good thing.

These particular issues, of course, aren’t about technologies. And neither, fundamentally, are the issues about ownership of social networks and use of commercially-provided services in the provision of educational and cultural heritage services (although I do acknowledge that the nature of IT can add extra complexities to the debate).

We need to recognise that the debates on the specifics of Facebook’s ownership, Bill Gates’ plans for Microsoft’s future role in Internet services and Rupert Murdoch’s plans for his media empire will only go so far. The followers of Cato (Catoers? Catoists?) will need to convince the followers of Cicero that their vision has a realistic chance of being implemented, otherwise the debates are doomed to be endlessly repeated.

Posted in General | 3 Comments »

How I (Inadvertently) Helped A Microsoft Patent Claim

Posted by Brian Kelly on 23 March 2008

I was recently using Google Scholar to try and find out more about the impact of my peer-reviewed publications.  Initially I was looking at papers published since 2004, but I then thought it would be interesting to see how far back the citation data might go. 

So I used Google Scholar to find out about links to my paper on The Evolution of Web Protocols which was published in the Journal of Documentation in 1999 (Vol. 55, No. 1 January 1999, pp. 71-81).

I discovered two citations to this paper: one in course material for a course on Organization of Information written by the School of Library and Information Studies at The University of Alabama and, much more interestingly, one in a US patent claim! The title of the patent is “System and method for discovering information about web resources“. And, as can be seen from Google Patent Search, the patent was filed in February 2002 and issued in August 2007, with the assignee being Microsoft Corporation!

The first part of the patent states that the claim is based on:

A computer-implemented method for identifying metadata about a first resource identified by a first Uniform Resource Identifier (“URI”), the method comprising:

issuing a request for the first resource identified by the first URI;

receiving a response document from the first URI;

parsing the response document received in response to the issued request, wherein the response document includes a second URI for accessing a second resource, wherein the response document includes an indication that metadata about the first resource exists on the second resource, wherein the indication indicates a metadata format;

generating a request to retrieve the metadata from the second resource, wherein the generated request is formatted to support the metadata format identified by the indication; and

retrieving the metadata from the second resource.

The patent goes on to describe how this will be implemented:

The computer-implemented method of claim 1, wherein the response document comprises an HTML document and the indication comprises a LINK tag.

 Yes, the patent is based on use of the HTML LINK tag to link to a metadata description.
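To make concrete just how unremarkable the claimed method is, here is a minimal sketch of the technique the patent describes: parse a response document for a LINK element pointing at metadata about the page, then retrieve that second resource. The sample HTML, the `rel="meta"` value and the URLs are illustrative assumptions only.

```python
# Sketch of the patent's claimed method: discover a metadata link in an
# HTML document using Python's standard-library parser. A real client
# would then fetch the discovered URI (step omitted here).

from html.parser import HTMLParser

class MetadataLinkFinder(HTMLParser):
    """Collect (rel, href, type) triples from <link> elements."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            self.links.append((a.get("rel"), a.get("href"), a.get("type")))

# Hypothetical response document containing a metadata link:
doc = """<html><head>
<link rel="meta" href="http://example.org/page.rdf" type="application/rdf+xml" />
</head><body>...</body></html>"""

finder = MetadataLinkFinder()
finder.feed(doc)
for rel, href, mime in finder.links:
    if rel == "meta":
        print(href)  # the "second URI"; fetching it yields the metadata
# → http://example.org/page.rdf
```

That is essentially the whole of claim 1: issue a request, parse the response for a LINK, note its format indication, and fetch the second resource.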

As my colleague Pete Cliff has pointed out to me:

OAI-ORE says you can include a resource map (which describes the aggregation of resources that make up, for example, a document – an article in the form of a Web page that includes images, say):

<link rel="resourcemap" href="" />

The resource map is metadata. Does this mean that doing this now will require paying a fee to Microsoft?

How can this patent claim have been granted? And why was my paper cited in the patent?

Looking back at my paper I find that I stated that:

Metadata can be described as the missing architectural component of the web.

I went on to say that:

Work in this area included Netscape’s proposal on “Meta Content Framework Using XML” [32] which provides a specification for describing information structures (metadata) for collections of networked information using XML and Microsoft’s “Web Collections using XML” [33] proposal for providing a metadata framework which can be used for a variety of applications, such as sitemaps, distributed authoring and content labelling.

Both of these proposals recognised the importance of XML for representing the syntax of the metadata. The proposals, together with other related work, led to the development of RDF, the Resource Description Framework, which provides a framework for metadata giving interoperability between applications that exchange machine-readable information on the Web [34].

At the time of writing (July 1998) work in developing RDF is still at an early stage. However RDF does seem to provide a mechanism for pulling together the various related metadata components and adding a new architectural component to the Web.

It seems the patent claim cites my work as evidence that use of the <LINK> tag to embed metadata was not envisaged back in 1998. However my paper was never intended to provide a complete description of the architecture of the Web. And I am sure that there will be examples of use of the <LINK> tag for this purpose prior to the submission of this patent in 2002.

My paper has clearly had an impact which I hadn’t expected! However, rather than flaming me for helping Microsoft to patent use of metadata in Web pages :-) I’d much rather the readers of this blog provided examples of prior art and suggested ways in which this patent can be overturned.

Posted in General | Tagged: | 1 Comment »

PLE 1.0 and PLE 2.0

Posted by Brian Kelly on 21 March 2008

The Debates

Martin Weller has recently commented on his Ed Techie blog that there has been a lot of discussion about PLEs (Personal Learning Environments) recently, and the relationships between PLEs, VLEs, TLEs (Teacher Learner Environments) and DPLEs (Default PLEs). Andy Powell has also discussed PLEs and PREs (Personal Research Environments) in a recent post on P vs. P in a user-centric world: the first of three posts he has written prior to our joint UCISA presentation.

PLE 1.0

This made me think about what I understand by the term PLE. And I realised that my first experience of a PLE was in primary school in the 1960s – back then a PLE was a Pen Learning Environment! And I was around at the time of several technological innovations, as well as different ways in which the Pen Learning Environment (which in this post I’ll refer to as PLE 1.0) was used to support my learning. When I started at school I have vague recollections of using a ‘scratch pen’ which we dipped in the ink well on our desk. However this was soon made obsolete by ‘biro’ technology. But when I passed my 11-plus and went to grammar school I remember one teacher who didn’t approve of ‘biro’ technology and insisted that all of our homework had to be submitted using a fountain pen. But such technological luddism wasn’t sustainable, and I think that only happened in my first year. By the time I was a teenager I was free to use a biro.

The initial focus of control was clearly on the technology itself. But I have only recently realised the different pedagogical approaches which accompanied PLE 1.0. In some classes the PLE was used to write down what the teacher had written on the blackboard. However with other teachers (or did this reflect other disciplines?) the inefficiencies of the teacher having to write on the blackboard were removed, and we had to copy directly from our text books.

It was only later on that the teachers seemed to lose interest in controlling the technologies used and allowed me, the learner, the flexibility to make notes as I preferred.

PLE 2.0

What can PLE 2.0, the Personal Learning Environment, learn from my experiences in the 1960s and 70s? I think our institutions are still focusing too much on the technologies themselves and the ways in which the technologies should be used – the scratch pen, biro and fountain pen debates revisited. And there seems to be a tendency to seek the best solution and make that the norm for all students – a Parker pen for all! But what we learnt from our writing instruments was the advantage to be gained when the technology became invisible and we were free to make our own choices. (But when, I wonder, did personalised pens become prevalent?)

The ideal PLE (to drop the versioning I introduced in this post) should surely follow the pen in becoming technologically invisible – just something that the learner uses to support their tasks. And, perhaps more importantly, the institution’s response should be to provide the flexibility needed to support this approach.

Posted in General, Web2.0 | 2 Comments »

NDAP 2008 Conference

Posted by Brian Kelly on 18 March 2008

I’m pleased to report that this week I am participating in the NDAP 2008 conference which is being held in Taipei, Taiwan.

NDAP (National Digital Archives Program) was launched in 2002 with the objective of promoting and coordinating content digitization and preservation at leading museums, archives, universities, research institutes, and other content holders in Taiwan. The NDAP International Conference aims to provide a forum to encourage and facilitate interaction, collaboration, and dialogue among specialists in digital archives from different countries.

There’s a good programme which starts today with an opening talk on “Digital Preservation: Where are we now? Where are we going?” by Deanna B. Marcum, Library of Congress, USA. I’m also looking forward to this afternoon’s Creative Commons/IPR Session. Tomorrow sees sessions on Digital Preservation, Biodiversity and Archives. The Museum 2.0 session on Thursday morning will give me an opportunity to catch up with Jennifer Trant and Sebastian Chan and in the afternoon I’ll be speaking in the Library 2.0 session.

I’m not sure what the network access will be like at the conference but I’ll try and publish reports on the sessions.

Posted in Events | Tagged: | Leave a Comment »

Revisiting Web Usage Metrics

Posted by Brian Kelly on 17 March 2008

I recently wrote a post on The UK Government and Web Metrics in which I described potential ambiguities in reporting on the usage of Government Web sites. In a comment on the post Phil Wilson observed that:

This extract from Hansard only really tells me one thing: there isn’t a government-wide standardised hit-tracking/visitor analysis scheme. 

That’s true – and the temptation would be to recommend the adoption of an industry standard, such as that provided by ABCE. As this page says:

The ABC international standards working party (IFABC, the International Federation of Audit Bureaux) has developed a set of rules and definitions that are the effective world-wide standard for Web audits. Definitions and rules specific to the internet industry in the UK and Ireland are controlled and developed by JICWEBS, the Joint Industry Committee for Web Standards. ALL current Industry agreed metrics are listed below (in alphabetical order):

Great, we have a standard which can be used for measuring Web usage.

The problem is, what if the content of a Web site is syndicated? What if users don’t visit the Web site to read the information, but expect the information to come to them, via their preferred RSS reader? 

This struck me when I viewed the usage statistics for my initial post on The UK Government and Web Metrics. At one stage all I could view via the administrator’s interface on the service was the overall hits on pages on my blog. But some time ago WordPress provided a display of syndicated accesses to blog posts, as can be seen in the image.

Web usage statistics for a blog post

Now what would I report on the day the post was published if I was making use of the ABCE’s standard for Web site usage? Fewer than 40 page views on the day the post was published, and a drop in views after that. The statistics showing the much higher number of syndicated views of the post would fail to be reported.

OK, so the usage data is flawed – but everyone knows that. The danger, of course, is if usage data becomes competitive, with services failing to be funded if the usage levels as recorded by Web site visits don’t reach acceptable levels. And what will providing RSS feeds to services do? It may provide a richer and more personalised service for the end user, but the Web usage figures as reported by tools which comply with the ABCE standard will drop.

Here’s an example of how use of an agreed international standard can potentially result in a failure to develop richer services for the user community. Now I’m not saying that we shouldn’t have an agreed baseline for usage statistics. Rather, Web site usage needs to be analysed in conjunction with an understanding of the alternative ways in which users may access the data. And I don’t know if there’s a standard available for this.
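The kind of combined analysis argued for above could be as simple as reporting on-site page views and syndicated feed views side by side, with neither hidden. A minimal sketch, with made-up figures:

```python
# Sketch: merge per-day on-site page views with per-day syndicated (RSS)
# views, keeping both components visible alongside the total. All figures
# are invented for illustration.

def combined_usage(page_views: dict, feed_views: dict) -> dict:
    """Return per-day site, feed and total view counts."""
    days = sorted(set(page_views) | set(feed_views))
    return {
        day: {
            "site": page_views.get(day, 0),
            "feed": feed_views.get(day, 0),
            "total": page_views.get(day, 0) + feed_views.get(day, 0),
        }
        for day in days
    }

page_views = {"2008-03-05": 38, "2008-03-06": 17}
feed_views = {"2008-03-05": 210, "2008-03-06": 150}
report = combined_usage(page_views, feed_views)
print(report["2008-03-05"]["total"])  # → 248
```

A report built this way would show the 38 on-site views the ABCE-style audit sees and the much larger syndicated audience it misses.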

Posted in rss | 3 Comments »

Final Score: 250 to 3 Victory for IT Services 2.0!

Posted by Brian Kelly on 14 March 2008

On Wednesday night Martin Weller and I were simultaneously sharing (via Twitter) the joy of a fightback, the tensions of extra time and the final failure of both our teams in the penalty shoot-out.

On Thursday morning, however, while I travelled to London for a meeting, Andy Powell spoke at the UCISA 2008 Management Conference. Following my video presentation Andy gave his contribution to the talk on “Digital Natives Run by Digital Immigrants: IT Services are Dead, Long Live IT Services 2.0!“. His slides are available on Slideshare:

And as Andy described to a live Twitter audience (which I only caught up with later that day), there was a debate at the conference on “this house belives (sic) that University IT services should block access to social networking sites“.

Andy reflected on the debate:

odd debate here… some people taking the motion very seriously… others treating it as a joke – hard to judge if people are seriously … … 

it’s a serious motion – though obviously positioned intentionally to stir up debate – but yes, basically it is daft

sanity prevails… only 3 out of about 250 IT Services directors voted in favour of blocking student use of social networks

Good news then :-) It seems IT Service managers overwhelmingly recognise that they can’t stop users accessing social networking services. But how was our talk received? Michael Webb has been blogging from the conference. He gave his views on my video presentation:

Anyway, morning themes were about Web 2.0/Social networking, starting with Brian Kelly from UKOLN and Andy Powell from EduServ – talking about IT Services 2.0. Brian wasn’t actually there though, and instead had pre-recorded his presentation. I find this pretty fascinating – I’ve had loads of discussions with people about why we don’t do this more often (we do actually do this for our IT induction), but it’s the first time I’ve experienced it as an audience member. So did it work? Somewhat against my expectations (Brian is a very engaging presenter in person) it worked fine (even with the low production values and a phone ringing half way through!).

And then went on to briefly summarise the content of my talk:

What about the content? Essentially the premise was that IT Services have evolved before, and can do so again, into IT Services 2.0 where we embrace, support, and educate users about the possibilities of externally hosted Web 2.0 services.

Michael’s thoughts on the views expressed by myself and Andy:

So where does that leave us? The common theme between Brian and Andrew’s talks were they were both saying we need to understand risks. Some of the risks, in my opinion (and, I think, Brian’s) aren’t that great – service reliability for example – how often is Google or Facebook down? Privacy of data across national borders though is a really challenging issue, and perhaps one of the most obvious stumbling blocks to wholeheartedly embracing some externally hosted technologies on an institutional level.

There’s another significant issue though – we don’t really have any control of this do we? Our work and home life and identities are becoming increasingly blurred – we can’t ban people from using Facebook to support learning. So how much user education are we actually responsible for, both from a moral and legal perspective? It’s something we all need to give more thought to.

Later on at the conference there were “two supplier presentations – one from Google, and one from Microsoft, both promoting their free, web based email/productivity/web 2.0 suites“. Michael made an interesting comment on the tension between the view expressed by myself and Andy – that IT Services should move towards playing an enabling role rather than being the provider of IT services – and the idea of encouraging Microsoft or Google to provide core IT services:

Second issue, and I need to reflect on this a little more, is that doesn’t this go against the IT Services 2.0 philosophy? We’d still be imposing a single tool set on our students (albeit an outsourced one) rather than educating our users to pick the best tools for any given activity. Maybe that’s an impractical aim – remember back to Sir Alan Langlands plea to keep things simple for academics? Don’t know – my instinct is that this sort of approach is still a very IT Services 1.0 things. Sure, Google Apps (say) may be a great tool set for a certain group of users for a given activity, but maybe another group or activity would work better with Elgg or WetPaint? I think this gets right to the heart of the IT Services 2.0 dilemma – how much technical diversity can our user base sustain? Or am I missing the point?

Now I don’t feel that making use of Google Apps should prevent use of Elgg or WetPaint – unless your institution has foolishly agreed to a contract which requires the institution to allow only a single provider of a service on campus (and I’ve heard this has happened with VoIP, which means institutions are contractually obliged to ban Skype from the campus :-( )

But how use of externally-provided Google and Microsoft services relates to a vision of small pieces loosely connected is an interesting question!

Posted in Web2.0 | 12 Comments »

The UK Government and Web Metrics

Posted by Brian Kelly on 12 March 2008

Spotted recently on Hansard (25 Feb 2008):

Departmental ICT

Norman Baker: To ask the Secretary of State for Innovation, Universities and Skills how many hits the (a) most popular website and (b) least popular website run by his Department has received since 1 January. [162286]

Mr. Lammy:The Department for Innovation, Universities and Skills corporate website was launched on 28 June 2007, following the machinery of Government changes and creation of the new Department. The numbers of hits for the most and least popular websites that come under the DIUS remit are as follows:

Website — Number of hits(1) from 1 January 2007 to 25 October 2007
The Intellectual Property Office — 236,301,690
Technology Strategy Board — (2) 82,370
(1) Please note that a ‘hit’ is simply a successful request to the web server from a visitor’s browser for any type of file, whether an image, HTML page, or any other type. A single web page can cause many hits, one for each image included on the page. (2) Figures are for page views from 1 July 2007 to 25 October 2007 as hits are not measured for this site.

Now what is worse, I wonder? The fact that Norman Baker, Lib Dem MP for Lewes, is asking about the popularity of UK Government Web sites based on such simplistic criteria, or the Government’s response, which compares ‘hits’ with ‘page views’? Even worse is that the official response is so defensive about having to provide figures on ‘page views’ (which is a legitimate measure of Web site usage) because data on hits (which reflect the Web site design and not the popularity of the Web site) are not measured.
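The distinction in the Hansard footnote is easy to make concrete: a count of ‘hits’ from a server log includes every image and stylesheet request, while page views count only the HTML pages. A sketch with an invented log of a single visit:

```python
# Sketch: why 'hits' and 'page views' are not comparable. One visit to a
# page carrying four images and a stylesheet generates six hits but only
# one page view. The log entries are invented examples.

def count_hits_and_page_views(log):
    """Count total requests (hits) and HTML-page requests (page views)."""
    hits = len(log)
    page_views = sum(1 for path in log if path.endswith((".html", "/")))
    return hits, page_views

log = [
    "/index.html",
    "/img/logo.gif", "/img/photo1.jpg", "/img/photo2.jpg", "/img/icon.png",
    "/style.css",
]
hits, page_views = count_hits_and_page_views(log)
print(hits, page_views)  # → 6 1
```

So a hit count largely reflects how many images a site’s designers chose to use, which is exactly why comparing one site’s hits with another site’s page views is meaningless.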

Even worse is that the response compares a whole Web site domain with a single Web site area.

And the latter Web page is no longer available – although I suspect that its content has moved elsewhere.

Perhaps we shouldn’t be surprised that a Government Web page which no longer exists isn’t particularly popular!

But what worries me most about such absurdities are the implications of the Government’s increasing preoccupation with such (flawed) measures of impact and the responses which might be expected from the Government critics.  I could easily envisage a Daily Mail leader article being critical of a drop in the numbers of ‘hits’ to Government Web sites, ignoring the realities of technological enhancements which may mean that although the numbers of hits or page views go down, the user may actually be getting a much more valuable and useful experience (e.g. the data being surfaced in other areas).

Posted in General | 8 Comments »

My Talk At The UCISA 2008 Conference

Posted by Brian Kelly on 10 March 2008

I mentioned previously my talk on “Digital Natives Run by Digital Immigrants: IT Services Are Dead – Long Live IT Services 2.0!” which I’ve been invited to present at the UCISA 2008 Management Conference. In my post I described the background to this talk and invited feedback on the slides which, together with an audio track, are available on Slideshare.

I was particularly struck by the comments made by Martin Weller:

Hi Brian – I have finally shed all institutional services – it’s marvellously liberating. And this is just the basic stuff – I have also evolved a PLE/PWE (for want of a better term). IT services simply can’t compete – just look at the email – my mailbox was full at the OU. With GMail I am using 1%. That’s an order of magnitude difference. And the same applies with every tool you care to mention in lots of different ways – design, usability, robustness (the idea that IT services hosted tools are less robust doesn’t stand up). 

Martin provided further information on how he sold his soul to Google on his own blog. The suggestion I’ve made previously, that IT Services need to transform themselves to take account of the Web 2.0 environment, is clearly demonstrated by Martin’s actions.

As I have another meeting which clashes with the UCISA conference I won’t be able to give my talk in person. However a video presentation of the talk is available in various formats, including this one which is hosted on the Zentation service.

IT Services Are Dead – Long Live IT Services 2.0!
Talk on IT Services Are Dead – Long Live IT Services 2.0!

Andy Powell will be co-presenting at the UCISA Conference – and Andy will be physically present :-)  Andy has already posted some of his thoughts on what he’ll be saying. In his post, entitled P vs. P in a user-centric world, Andy focusses on the “move towards user-centricity … and in particular the use of the word ‘personal’ in both Personal Learning Environment (PLE) and Personal Research Environment (PRE)“.

Martin Weller provides a good example on how individuals are beginning to select their own preferred set of IT tools, and no longer feel constrained by the tools provided by the institution.  But is this the start of an inevitable trend or will it be limited to small numbers who are highly skilled in use of IT?  What about the pitfalls? And how should IT Services respond?

Time permitting, Andy Powell will address comments made on this blog and on his eFoundations blog at the UCISA conference. Here’s an opportunity to make your voice heard.

Posted in Events | Tagged: | 9 Comments »

Top of the Pods, Podpickers

Posted by Brian Kelly on 7 March 2008

Which UK University has the most popular podcast? This question occurred to me recently after visiting a page on the JISC Web site in order to subscribe to JISC podcasts. Following the link launched iTunes and allowed me to subscribe to the podcast, so that new podcasts are downloaded automatically.
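As an aside, the mechanics behind “subscribing” are worth noting: a podcast is simply an RSS feed whose items carry enclosure elements pointing at audio files, and a client such as iTunes periodically re-fetches the feed and downloads any enclosures it hasn’t already seen. Here is a minimal sketch in Python of that check; the feed, titles and URLs are invented for illustration, not the actual JISC or university feeds:

```python
# Sketch of what a podcast client does after you subscribe: parse the show's
# RSS feed and pick out <enclosure> elements (the downloadable audio files),
# skipping any already downloaded. Uses an invented example feed.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example University Podcast</title>
    <item>
      <title>Lecture 1: Chaucer</title>
      <enclosure url="http://example.ac.uk/pod/lecture1.mp3"
                 type="audio/mpeg" length="12345678"/>
    </item>
    <item>
      <title>Lecture 2: The Wife of Bath</title>
      <enclosure url="http://example.ac.uk/pod/lecture2.mp3"
                 type="audio/mpeg" length="23456789"/>
    </item>
  </channel>
</rss>"""

def new_episodes(feed_xml, already_downloaded):
    """Return (title, url) pairs for enclosures not yet downloaded."""
    root = ET.fromstring(feed_xml)
    episodes = []
    for item in root.iter("item"):
        title = item.findtext("title")
        enclosure = item.find("enclosure")
        if enclosure is not None:
            url = enclosure.get("url")
            if url not in already_downloaded:
                episodes.append((title, url))
    return episodes

# A client would run this on each periodic re-fetch of the feed.
print(new_episodes(SAMPLE_FEED, {"http://example.ac.uk/pod/lecture1.mp3"}))
```

Real clients add caching, conditional HTTP requests and download management on top, but the core loop is just this re-fetch-and-diff of the feed.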

I noticed the search option in iTunes and thought I’d search for University podcasts. The most popular podcasts came from Vanderbilt University, but in third place was Oxford University. And listening to the start of the current podcast I discovered the title was “Podcasts from Medieval English lectures”. So much for the dumbing down of the iTunes generation! Who’d have thought that all of those young students with their white ear pieces were catching up on Chaucer – perhaps “The Wife of Bath’s Tale”!

UK University Podcasts

In second and third places for UK universities were the universities of Edinburgh and, I’m pleased to say, Bath. The University of Bath not only has the kudos of a top-three place in iTunes popularity; the public lecture podcast series at my host institution also recently won a European award. As the press release announced, “its podcasts had from November 2006 to September 2007 been seen (sic) 188,000 times“. The press release went on to say that “Our podcasts are popular enough to get us featured in the top 50 podcast originator on i-Tunes in the “Science and Medicine” section, ahead of any other university in the world“.

I think this is a great example of an institution successfully engaging with a popular Web 2.0 service (iTunes) in order to maximise its impact. My congratulations to the Audio Visual and Web Services teams at the University of Bath.

But apart from Oxford, Edinburgh and Bath, where are the other UK universities? There don’t appear to be any in the top 50 places in iTunes, although I did spot Aberdeen at about the 68th position, followed by a cluster of the universities of Swansea, Westminster and Cambridge. Are UK Universities missing out, I wonder?

Posted in Web2.0 | Tagged: | 9 Comments »

Workshop On Risk Management

Posted by Brian Kelly on 5 March 2008

The JISC OSS Watch service are running a workshop on “Risk Management in Open Source Procurement” which Ross Gardler describes in a blog post on the OSS Watch Team blog.

The background to this event, which will be held in Oxford on 18 March 2008, is described in an article on open source in HE and FE published in the October 2007 edition of JISC Inform in which Ross suggested that:

There is often a lack of understanding about how best to consider OSS as part of institutional IT procurement and development activities. Ross Gardler, manager of the HE and FE advisory service for open source software, believes such issues can be explained by difficulties surrounding evaluation techniques.

‘There often isn’t an established marketing department that will take you out for lunch and smooth talk you about the potential benefits, like there is with a commercial provider,’ he says.

I can recall that about 10 years ago there seemed to be a feeling that having source code available under an open source software licence was sufficient to guarantee the sustainability of software. But you just have to look at examples such as the ROADS software, which drove a number of what are now known as the Intute hubs. In the graveyard of the many open source software projects which failed to be sustainable in the long term, you’ll find a plot for ROADS. We do need to carry out risk analysis and risk management.

So I’m pleased to see that OSS Watch are running a workshop which will cover the risks associated with procurement of open source software. In his blog post Ross goes on to describe how the OSS Watch service “provide[s] one-to-one consultancy services to help people understand how to evaluate open source and open source providers using frameworks such as the Business Readiness Rating and the Open Source Maturity Model.” The workshop will provide an opportunity for OSS Watch to share their expertise with a wider community.

Of course, there’s not risks risks aren’t only associated with open source software – there are risks associated with use of proprietary software. And also, it needs to be said, use of externally-hosted Web 2.0 services – as we saw recently with the recent downtime of the Amazon S3 service which affected other services including Twitter.

This doesn’t mean, however, that we shouldn’t use externally hosted Web 2.0 service – or, indeed, open source software. Similarly the recent crash of the Northern Rock Bank doesn’t mean that we should withdraw our savings and stuff the cash under our mattresses!

I suspect that a workshop on “Risk Management and Web 2.0” would be popular. I’ve posted previously on Your Views On Externally-Hosted Web 2.0 Services back in September 2007. But, apart from the risk assessment documents which have been produced at the universities of Oxford and Edinburgh, have any other institutions published anything in this area?

Posted in Web2.0 | 2 Comments »

IT Services Are Dead – Long Live IT Services 2.0!

Posted by Brian Kelly on 3 March 2008

Back in March 2004 I was pleased to be invited to give a talk at the UCISA Management Conference on “What Can Internet Technologies Offer?”, in which I introduced a raft of collaborative and communications technologies (now referred to as Web 2.0) to about 350 senior managers in IT Service departments. Two years later I was invited back and gave a talk on “IT Services: Help Or Hindrance?”, in which I argued that IT Services needed to engage actively in providing access to services such as blogs and wikis, otherwise there would be a danger that central services would be marginalised.

I’m pleased to say that IT Service directors seem to like my talks as I’ve been invited back again this year to speak at the UCISA 2008 Management Conference. The title of this year’s talk is Digital Natives Run by Digital Immigrants: IT Services are Dead, Long Live IT Services 2.0!” and the talk will be given on 13 March 2008. Unfortunately I have another meeting already arranged  for that date – but rather than this being a problem I regard it as a useful opportunity to make use of another set of technologies and approaches to presenting. So I have prepared the initial draft of my slides, and have made it available as a Slidecast (i.e. with an accompanying audio track) on Slideshare.

This 15-minute presentation provides only a high-level view of my thoughts on why IT Service departments need to engage with the use of third-party services. But I’m pleased to say that Andy Powell will be a co-presenter and will be attending in person. Andy will be giving his views on the implications of Web 2.0 for IT Service departments, and will be able to respond to questions from the audience.

But rather than my talk simply being presented on the day, in the spirit of openness which I wrote about recently in the context of open science, I would like to invite comments on my talk in advance of the conference, which Andy may be able to integrate into his presentation. And, as an article on Technology Populism: Risks & Rewards points out, there can be risks to the organisation when users circumvent IT Service departments.


Posted in Events, Web2.0 | 4 Comments »

The Demise of Netscape Navigator

Posted by Brian Kelly on 1 March 2008

An article entitled In praise of … Netscape Navigator announced that today (Saturday, 1 March 2008) sees the official end of support for the Netscape Navigator Web browser.

The “In praise of” column does indeed praise Netscape for “opening the web, [and] pav[ing] the way for everything from Google to Wikipedia“.

What the column doesn’t say is the that the browser went from strength to strength after it was launched by ignoring standards bodies and introducing several new proprietary HTML extensions which infuriated HTML standards groups when they were released. As an article in Wikipedia describes:

Through the late 1990s, Netscape made sure that Navigator remained the technical leader among web browsers. Important new features included cookies, frames, and JavaScript (in version 2.0). Although those and other innovations eventually became open standards of the W3C and ECMA and were emulated by other browsers, they were often viewed as controversial. Netscape, according to critics, was more interested in bending the web to its own de facto “standards” (bypassing standards committees and thus marginalizing the commercial competition) than it was in fixing bugs in its products. Consumer rights advocates were particularly critical of cookies and of commercial web sites using them to invade individual privacy.

But why is the Guardian praising Netscape, if the company behaved in this fashion? Well, I think the Guardian was right when it says that “Everyone from secretaries to salesmen started logging on” thanks to the initial success and popularity of the browser. But let’s not rewrite history and suggest that this was due to the software vendor supporting open standards – rather, and ironically, its success was due to flouting the standardisation process and forcing through innovations (which, in some cases, subsequently became standardised) while seeking to position itself as the dominant vendor in the marketplace.

Of course, although Netscape was the dominant player for a short period, this did not last, with Microsoft’s Internet Explorer eventually becoming the world’s most widely-used browser, despite the appeal which Firefox has to its admirers.

Strange how things turn out.

Posted in browser | 2 Comments »