UK Web Focus

Innovation and best practices for the Web

Archive for August, 2008

Blog Day 2008

Posted by Brian Kelly (UK Web Focus) on 31 August 2008

A tweet from joeyanne alerted me that today is Blog Day 2008.  As I only found out about this at 6.30 pm today I will have to be brief in my list of blogs that I find interesting.

The instructions for contributing to Blog Day are:

  1. Find 5 new Blogs that you find interesting
  2. Notify the 5 bloggers that you are recommending them as part of BlogDay 2008
  3. Write a short description of the Blogs and place a link to the recommended Blogs
  4. Post the BlogDay Post (on August 31st)

My five choices, which I’m afraid may not be new to many readers of this blog, are:

  1. The Ed Techie blog by Martin Weller, Professor of Educational Technology at the Open University – and someone I have had valuable Twitter discussions with.
  2. The unspun Electronic Museum blog, in which Mike Ellis argues passionately for the adoption of lightweight Web 2.0 approaches within the museum community.
  3. The Digital Curation blog, in which Chris Rusbridge in particular provides remarkably refreshing insights into preservation issues, even going as far as to ask whether the “Digital Preservation” term [should be] considered harmful?
  4. The JISC Access Management Team blog, probably the liveliest of the blogs published by JISC programme managers.
  5. I mentioned Tony Hirst’s Ouseful blog in a previous list of my favourite blogs, but as that referred to an old version of the blog I feel I’m allowed to mention this blog again, which Tony uses to write copious summaries of his prolific development activities.

And as today is Blog Day I thought this would provide an opportunity to launch the first of a series of brief video blog posts entitled Video blog 1: Why I Blog which I am publishing in order to support a workshop on “Using Blogs Effectively Within Your Library” which my colleagues Marieke Guy and Ann Chapman will be facilitating at the ILI 2008 conference.

If you are a blogger, why not sign up to Seesmic and respond to my post, explaining why you blog? You never know, you might get mentioned when Marieke and Ann run the workshop. There’s a marketing opportunity for you, especially if you are a blogging librarian.

Technorati tag: BlogDay2008

Posted in Blog | 2 Comments »

The Final JISC PoWR Workshop

Posted by Brian Kelly (UK Web Focus) on 29 August 2008

The final workshop organised by the JISC-funded Preservation of Web Resources (PoWR) project will take place at the University of Manchester on Friday 12th September 2008.

Now you may think that preservation is a pretty dull topic, compared with the exciting developments that are taking place in a Web 2.0 environment. And if that’s what you think, then you’re not alone. As Alison Wildish, head of Web Services at the University of Bath, described on the Web Services team blog:

We were asked by our colleagues at UKOLN (who organised the event) to deliver a brief talk detailing our approach to preserving web resources at the University. Our initial reaction was that we had little to say. Lizzie’s remit lies with the paper records and I am responsible for managing our website – ensuring it meets the needs of our users. Neither of us felt web preservation was something we had expertise in nor the time (and for me the inclination) to fully explore this.

And you can even listen to Alison and Lizzie Richmond (University of Bath records manager, archivist and FOI coordinator) expand on this by viewing the Slidecast of the talk they gave at the first JISC PoWR workshop:

If you listen to the end of the Slidecast you’ll hear Alison and Lizzie describing how, in the course of the discussions, they discovered reasons why Web preservation is a topic which needs to be treated seriously.

But how should you go about Web preservation? What should you preserve? What should you discard? What are the implications of Web 2.0 for preservation policies? Whose responsibility is it? What are the costs associated with preservation? And what are the costs and associated risks of not developing and implementing a preservation policy for your Web resources? And how do you ensure that an institutional preservation policy is sustainable and embedded within the institution?

These are some of the topics which have been raised on the JISC PoWR blog and will be discussed at the workshop. But hurry up and book your place, as the deadline for bookings is Friday 5th September. And note that the workshop is free to attend for members of the higher and further education community.

And finally I should point out that the case study given by Alison Wildish and Lizzie Richmond has been saved from being trapped in the non-interoperable world of the past, accessible only to Doctor Who (and even then only on a good day), by recording the talk, synching the recording with the slides and hosting this on Slideshare. You see, preservation can be enhanced through use of Web 2.0 services. Digital preservation can be cool – even though, arguably, it may kill the odd polar bear :-)

Posted in preservation, Web2.0 | Leave a Comment »

Defining An “Amplified Conference”

Posted by Brian Kelly (UK Web Focus) on 28 August 2008

The term ‘amplified conference’ was, I believe, coined in a blog post by Lorcan Dempsey in which he observed that “It is interesting to watch how more conferences are amplifying their effect through a variety of network tools and collateral communication”.

It will be noted that Lorcan didn’t seek to define what he meant by the term, but was merely observing a pattern of uses of networked technologies at events, in his example a number of JISC events, although such uses predate this, as I described in a paper on “Using Networked Technologies To Support Conferences” published in June 2005.

But we don’t seem to have an agreed definition of the term. And this can be problematic, especially if we decide that we want to host an ‘amplified conference’.

So I thought I’d set the ball rolling by describing what I mean by an amplified conference.


The term amplified conference describes a conference or similar event in which the talks and discussions at the conference are ‘amplified’ through use of networked technologies in order to extend the reach of the conference deliberations.

The term is not a prescriptive one, but rather describes a pattern of behaviours which initially took place at IT and Web-oriented conferences once WiFi networks started to become available at conference venues and delegates started to bring with them networked devices such as laptops and, more recently, PDAs and mobile phones.

We can observe a number of ways in which conferences can be amplified through use of networked technologies:

Amplification of the audiences’ voice: Prior to the availability of real-time chat technologies at events (whether IRC, Twitter, instant messaging clients, etc.) it was only feasible to discuss talks with your immediate neighbours, and even then this might be considered rude. Such chat technologies now allow the audience’s discussions to be shared across, and beyond, the lecture theatre.

Amplification of the speaker’s talk: The availability of video and audio-conferencing technologies makes it possible for a speaker to be heard by an audience which isn’t physically present at the conference. Although video technologies have been available to support conferences for some time, this has normally been expensive and required use of dedicated video-conferencing technologies. However the availability of lightweight desktop tools makes it much easier to deploy such technologies, without even requiring the involvement of conference organisers.

Amplification across time: Video and audio technologies can also be used to allow a speaker’s talk to be made available after the event, with use of podcasting or videocasting technologies allowing the talks to be easily syndicated to mobile devices as well as accessed on desktop computers.

Amplification of the speaker’s slides: The popularity of global repository services for slides, such as Slideshare, enables the slides used by a speaker to be more easily found, embedded on other Web sites and commented upon, in ways that were not possible when the slides, if made available at all, were only available on a conference Web site.

Amplification of feedback to the speaker: Micro-blogging technologies, such as Twitter, are being used not only as a discussion channel for conference participants but also as a way of providing real-time feedback to a speaker during a talk. We are also now seeing dedicated microblogging technologies, such as Coveritlive and Scribblelive, being developed which aim to provide more sophisticated ‘back channels’ for use at conferences.

Amplification of a conference’s collective memory: The popularity of digital cameras and the photographic capabilities of many mobile phones are leading to many photographs being taken at conferences. With such photographs often being uploaded to popular photo-sharing services, such as Flickr, and such collections being made easier to discover through agreed use of tags, we are seeing amplification of the memories of an event through the sharing of such resources. The ability of such photographic resources to be ‘mashed up’ with, say, accompanying music can similarly help to enrich such collective experiences (such as the Animoto clips of IWMW 2007 and UKOLN’s Exploiting The Potential Of Blogs and Social Networks Workshop).

Amplification of the learning: The ability to follow links to resources and to discuss the points made by a speaker during a talk can enrich the learning which takes place at an event, as described in Shabajee’s article “‘Hot’ or Not? Welcome to real-time peer review”, published in the Times Higher Educational Supplement in May 2003.

Long term amplification of conference outputs: The availability in a digital format of conference resources, including ‘official’ resources such as slides and video and audio recordings which have been made by the conference organisers with the approval of speakers, together with more nebulous resources such as archives of conference back channels, photographs and unofficial recordings taken at the event, may help to provide a more authentic record of an event, and one which could potentially serve as a valuable historical record.

Well, that’s my initial attempt at defining what I understand by the term ‘amplified conference’. I should add that in this post I’m not discussing any of the limitations of amplified conferences (which I’ve commented on previously). My final comment is to point out that I actually organise ‘amplified workshops’ and ‘amplified seminars’, but neither of these terms seems to have the resonance of ‘amplified conference’. So I suspect we should probably stick with this term to refer to a range of events.

Does this definition work for you?

Posted in Events, Web2.0 | 2 Comments »

MyBristol Toolbar

Posted by Brian Kelly (UK Web Focus) on 27 August 2008

I was alerted to the MyBristol portal via a tweet from Mike Ellis who commented on the URIs it uses:

woa – check out the beautiful friendly url’s on UPortal… http://tinyurl.com/5uwr8k

Now I’d agree that

https://portal.bris.ac.uk/mybristol/tag.7ef20678c7572c37.render.userLayoutRootNode.uP?uP_root=root&uP_sparam=activeTab&activeTab=1

is a rather ‘uncool URI’. But I was more interested in the MyBristol portal service itself and, in particular, the portal toolbar which is available for the Firefox browser:

The Add Newsfeed option “allows you to maintain a personalised set of newsfeeds”. Wouldn’t it be great if every institution provided a service like this, which allowed your news feeds and your bookmarks to be stored in a managed environment, and which also allowed such data to be seamlessly stored on your preferred external service (perhaps del.icio.us or Diigo for your bookmarks and Google Reader or Netvibes for your news feeds)?

I feel that the ability to store such resources on a remote service is needed in order to gain the ‘network effect’ that popular remote services can provide. But I’d also like to have a managed local copy, so I wouldn’t have to worry if the remote service went down, its performance was unreliable or I was concerned about the privacy implications of storing sensitive information remotely. And I’d like such services to work transparently, so I wouldn’t have to worry about managing plugins myself.
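A minimal sketch of this ‘local copy first, remote mirror second’ pattern, assuming a hypothetical remote_push callable standing in for an external bookmarking API. The class and file layout are my own invention for illustration, not a description of how MyBristol actually works:

```python
import json
from pathlib import Path

class BookmarkStore:
    """Keep a managed local copy of bookmarks, mirroring each one to a
    remote service on a best-effort basis. The remote_push callable is a
    stand-in for, say, a del.icio.us or Diigo API call."""

    def __init__(self, path, remote_push=None):
        self.path = Path(path)
        self.remote_push = remote_push
        if self.path.exists():
            self.bookmarks = json.loads(self.path.read_text())
        else:
            self.bookmarks = []

    def add(self, url, title):
        bookmark = {"url": url, "title": title}
        self.bookmarks.append(bookmark)
        # The local write happens first, so the data survives a remote outage.
        self.path.write_text(json.dumps(self.bookmarks, indent=2))
        if self.remote_push is not None:
            try:
                self.remote_push(bookmark)
            except Exception:
                pass  # a failed remote push must not lose the local copy
```

The key design choice is simply ordering: the managed local store is the system of record, and the popular remote service is treated as a replica that can fail without data loss.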

Are such approaches being developed?

Posted in Web2.0 | 5 Comments »

Squirl: When Web 2.0 Services Break

Posted by Brian Kelly (UK Web Focus) on 25 August 2008

I have previously described that when you make use of third party Web 2.0 services you need to acknowledge the possible risks: yes, if you use Google Docs there are risks if Google goes out of business or the Google service is down. I have been willing to take such risks, especially with well-established and well-used services such as Google’s portfolio of services and others such as del.icio.us and Slideshare.

But what about less well-known services? What happens if such services do break? After all, as my colleague Paul Walk has recently pointed out, "there is a growing, commonly-held belief that we are about to enter a global recession" and, as "venture capital can become harder to find in a period of economic down-turn", Paul asks "is this a good moment for HEIs to begin a brave experiment with outsourcing services to remote companies?"

An example of a Web 2.0 service which has become broken happened to me recently. In January 2007 I came across the Squirl service. I wanted to explore a number of Web 2.0 services, so I used Squirl to keep a record of the books I was reading. The service has links to Amazon, so I simply needed to type in the title of the book and select the appropriate version, and it would store a description of the book, including an image of the cover.

That worked fine, and by February 2008 Squirl was keeping a record of 42 books.

But when I finished reading the next book, I found that the link to Amazon had stopped working. I thought no more of it (it wasn’t a mission-critical service, after all) but went back several times afterwards, after reading more books.

Eventually I went to the Squirl groups and discovered a series of messages complaining about the service, as illustrated. Unfortunately there has been no response to any of the messages from anyone working for Squirl. It was also unfortunate, I felt, that Squirl didn’t provide a blog about their service, which I could have added to my RSS reader, using various RSS filtering tools to help spot any worrying announcements or concerns raised by users.

I can still create entries manually (although this does not pull in the images from Amazon). But as the service was still working, apart from retrieval of the metadata from Amazon, I wasn’t too concerned, especially as I had checked that there was a data export function when I signed up for the service. But when I tried to export my data as a CSV file I got the following error message:

Sorry, we screwed up.

An email has been sent to somebody at squirl, and we’ll try and fix the problem as soon as possible. You might be able to find what you were looking for with the search engine above.

If the problem persists, please contact broken@squirl.info.

And rest assured, somebody is going to get a permanent letter in their file for this. I mean, heads will roll.

I first saw this error message back in February, I think, and I’m still getting the same message in August :-(  Even worse, when I send an email message to the address given above I find that the email address no longer exists.

Fortunately, as the service provides an RSS feed of my data, I have been able to retrieve it. But this experience has helped to identify a number of approaches which one should take to help minimise such risks in the future. I think ideally the steps would be:

  • Find out details about who is providing the service.  Is it well-funded? Is it likely to, for example, be sustainable through the current troubled economic times?
  • Does the service allow the data to be exported? Can the data be exported in a rich format, allowing the service to be recreated without too much difficulty?
  • Check the data export functionality and import into a new service.
  • Possibly replicate the data in a complementary service (note this is something I do with this blog).
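As a sketch of the last two steps, here is how an RSS feed of one’s data might be turned into a local CSV backup. The sample feed, URL and field names are illustrative, not Squirl’s actual ones; only Python’s standard library is used:

```python
import csv
import io
import xml.etree.ElementTree as ET

def parse_rss_items(rss_xml):
    """Extract title, link and pubDate from each item in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "pubDate": item.findtext("pubDate", default=""),
        })
    return items

def items_to_csv(items):
    """Serialise the parsed items to CSV so a local backup copy exists."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["title", "link", "pubDate"])
    writer.writeheader()
    writer.writerows(items)
    return buf.getvalue()

# In practice the feed would be fetched over HTTP, e.g. with
# urllib.request.urlopen("https://squirl.info/.../feed.rss") -- that URL
# is illustrative, not the service's actual feed address.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel><title>My books</title>
<item><title>Fahrenheit 451</title><link>http://example.org/1</link>
<pubDate>Mon, 25 Aug 2008 00:00:00 GMT</pubDate></item>
</channel></rss>"""

print(items_to_csv(parse_rss_items(SAMPLE_FEED)))
```

Run periodically, a script like this gives you the replicated local copy the checklist asks for, independent of whether the service’s own export function still works.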

In addition to these points related to the service and the data, I would also look to see whether the service provides announcements and discussions using a blog rather than, as in this case, forum software, since I add feeds from the third-party services I use to my blog reader, which allows me to periodically check for any untoward discussions in a single place.
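As a sketch of that monitoring idea: once such feeds are aggregated, a simple keyword filter can flag announcements worth a closer look. The entry shape assumes dicts like those a typical feed parser yields, and the warning terms are my own guesses at phrases that tend to precede a service’s demise:

```python
# Phrases that, in my experience, often signal trouble for a hosted service.
WARNING_TERMS = ("shut down", "shutting down", "closing", "outage",
                 "discontinued", "acquired", "maintenance mode")

def flag_worrying_entries(entries):
    """Return the feed entries whose title or summary mentions one of
    the warning terms; each entry is a dict with 'title' and 'summary'."""
    flagged = []
    for entry in entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(term in text for term in WARNING_TERMS):
            flagged.append(entry)
    return flagged

entries = [
    {"title": "New feature: tag clouds", "summary": "Now live for all users"},
    {"title": "Important announcement", "summary": "The service is closing on 1 October"},
]
for entry in flag_worrying_entries(entries):
    print(entry["title"])  # only the 'closing' announcement is flagged
```

A crude filter like this is no substitute for reading the announcements, but it turns a pile of feeds into a short list of items that actually need attention.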

It might be felt that having to implement such processes for any Web 2.0 service could be very time-consuming. But, of course, across a community we are likely to find uses of such services being made by others. So perhaps what we need is to make use of social networks to share our experiences, and have mechanisms in place to alert others to any possible problems (and I’m alerting other Squirl users of problems with the service 6 months after I first spotted them).

Of course, in order to ensure that we have our risk assessment processes in place we will also need an audit of the services we use.  That’s a topic I’ll discuss in a future post.

Posted in Web2.0 | Tagged: | 6 Comments »

What Is JISC?

Posted by Brian Kelly (UK Web Focus) on 22 August 2008

I recently noticed a referrer link to this blog coming from the Answers.com Web site. I’ve not visited this site before so I thought I’d visit and use the service to find an answer to a question. The question I thought I’d ask was “What is JISC?” And, as shown below, I found that “The Joint Information Systems Committee (JISC) supports United Kingdom post-16 and higher education and research by providing leadership in the use of ICT (Information and Communications Technology) in support of learning, teaching, research and administration. JISC is funded by all the UK post-16 and higher education funding councils.“.

What is JISC?

This answer is taken from the JISC entry in Wikipedia. Similar results are found by asking questions such as “What is UKOLN?” and “What is Bath University?”, as well as for more general questions such as “What is research?”, although for questions such as “What is education?” the answers are drawn from a variety of sources, with the Wikipedia definition to be found after results from sources such as The American Heritage Dictionary, Roget’s II: The New Thesaurus, Third Edition and the Britannica Concise Encyclopedia.

What are the implications of this? The first, unsurprisingly, is that if information about your organisation or your areas of interest is available in Wikipedia, then the Creative Commons licence which is assigned to the material will help to ensure that this information is surfaced in multiple locations.

And perhaps more subtly, if you don’t use Wikipedia, or you require that your students don’t use Wikipedia, you may find that you are inadvertently using information held by Wikipedia and made available via other services such as Answers.com. In the search for JISC the top entry was clearly labelled as coming from Wikipedia, but in the example of “What is education?” the first set of references came from more traditional sources of information, and if you scroll down you may miss the citation details for the entry from Wikipedia.

My view is that providing information about your organisation or the topics you care about in Wikipedia will help to maximise awareness of, and interest in, such information. And failing to provide such information on the grounds that people shouldn’t use Wikipedia is mistaken. But if you do make use of Wikipedia you should be careful to provide an objective, encyclopedia-like definition and avoid the trap of the entry sounding like an advertisement:

JISC entry in Wikipedia

Posted in Web2.0, Wikipedia | Tagged: | 7 Comments »

The ILI Tenth Anniversary

Posted by Brian Kelly (UK Web Focus) on 21 August 2008

The Internet Librarian International Conference Is Ten

This year sees the 10th anniversary of the Internet Librarian International (ILI) conference. This year’s event, ILI 2008, will be held at Novotel London West, London, UK on 16-17th October 2008. And, unfortunately, it will be the first ILI conference I won’t be able to attend. I have spoken at all of the ILI conferences and have also been a member of the programme committee and chaired sessions for a number of years.

My Involvement In ILI Conferences

Details of all of my talks at ILI are available on the UKOLN Web site. In light of the forthcoming anniversary I thought it would be interesting to produce a timeline of my involvement with the conference. I used the Dipity software to produce the timeline of my involvement in the ILI conference series, as illustrated below (and I should add that an embedded version of this is available on the UKOLN Web site, which also provides access to a locally managed copy of the data, so that potentially the service can be recreated if the Dipity service is not sustainable).

ILI Timeline

The conference has been of particular relevance to UKOLN, as it has provided an opportunity to actively engage with the communities served by both of our core funders: the academic libraries and JISC development community, together with those working in public libraries. Producing this timeline has provided a useful opportunity to observe and reflect on the topics which have been of interest to these communities over this time.

Talks On Web Standards

My first talk was entitled “New Standards on the Web” and described emerging new Web standards, including a range of XML standards (XLink and XPointer) and RDF. Looking back at the presentation (and the references to related work such as Eric Miller’s slides on support for RDF in Netscape) I can see how naive I was in my expectation that the emerging new W3C standards would be quickly deployed in a mainstream service environment. I gave another talk on standards at ILI 2003 entitled “HTML Is Dead! A Web Standards Update” in which I avoided the complexities of Semantic Web standards and focussed on data formats including SVG and SMIL. Again I was soon able to appreciate that the marketplace had little interest in these standards, although my comments on the importance of XML and CSS, for example, were appropriate and timely. The final talk I gave related to Web standards was at ILI 2005 and was entitled “Facing The Challenges Of A Standards-Based Approach To Web Development“. Here I reflected on the failure of various Web standards to gain acceptance in the marketplace and described the ‘contextual approach to use of open standards’ which I had been involved in developing for the JISC, to help avoid repeating the costly mistakes made in the past when open standards (e.g. Coloured Book software) had continued to be advocated even after their failures had been widely acknowledged.

Web Accessibility

A talk on “Benchmarking Of Library Web Sites” given at ILI 2002 included a description of the use of automated Web accessibility testing tools. The following year, at ILI 2003, I took part in a Web accessibility panel session entitled “Web Site Accessibility: Too Difficult To Implement?” and this time gave one of my first presentations in which I argued that the traditional approaches to providing accessible Web resources, based on implementation of WCAG guidelines, were flawed. Two years later the joint UKOLN/Techdis holistic approach to Web accessibility had been developed, and at ILI 2005 I was able to run a half-day workshop with Lawrie Phipps on “A Holistic Approach To Web Usability, Accessibility And Interoperability“.

Best Practices For Publishing E-Journals

ILI conferences have provided a dissemination opportunity for various projects I have been involved in. I gave a talk on “Electronic Magazines: Issues in Implementation” at ILI 2000 which described the EU-funded Exploit Interactive e-journal. The following year, at ILI 2001, Marieke Guy and myself ran a half-day workshop session on “Publishing Web Magazines, e-Journals & Webzines“, the first of four workshop sessions I have facilitated at ILI conferences.

Other Areas

Other topics which I’ve covered at ILI conferences have included advertising on Web sites (at ILI 2001), new devices on the Web (ILI 2002) and quality assurance for Web sites (a half day workshop at ILI 2004).

Web 2.0

Since ILI 2004 the main focus of my involvement at ILI has been related to Web 2.0. The first talk was entitled “Beyond E-mail! Wikis, Blogs and Social Networking Software“, with a talk on “The Sceptics View Of New Technologies” being given in a panel session at the ILI 2004 event.

A talk on “Email Must Die!” at ILI 2005 described the benefits of various Web-based collaborative and communications tools, and, at the same event I continued to argue the need to adopt a critical approach to the new technologies with a talk on “Folksonomies – The Sceptics View“.

I was invited to chair a session on Wikis at ILI 2006 and, due to the late unavailability of one of the invited speakers, also gave a brief talk on “Reflections On Personal Experiences In Using Wikis“. My main talk that year was on “Web 2.0 and Library 2.0: Addressing Institutional Barriers“.

Finally at ILI 2007 Kara Jones and myself ran a masterclass on “Using Blogs Effectively Within Your Library” and I gave a talk on “The Blogging Librarian: Avoiding Institutional Inertia“.

Returning To ILI 2008

I had intended to participate in the ILI 2008 conference, but as I have been invited to present a paper at the Bridging Worlds 2008 conference, I will unfortunately not be able to attend. I will be there in spirit, though, with my colleagues Marieke Guy and Ann Chapman facilitating the half-day blogging workshop this year.

I would like to take this opportunity to give my thanks to everyone who has helped to make the ILI conference series such a great success, especially the conference organisers (including Marydee Ojala, Jane Dysart, Nancy Garman, David Raitt, Bill Spence, Jean Mulligan) and the people I’ve met at ILI (too numerous to mention, but I should include Michael Stephens, Mary Peterson, Frank Cervone, Karen Blakeman, Phil Bradley, Darlene Fichter and Peter Scott). All my best wishes to everyone at ILI 2008 – and all the best for the next 10 years.

Posted in Events | Tagged: , | 3 Comments »

The Markmail Service

Posted by Brian Kelly (UK Web Focus) on 18 August 2008

In a recent tweet Matt Jukes alerted me to the MarkMail service. As Matt forms part of my trusted “interesting Web applications alerting services” I went to the Web site. What I found was a search interface across over 4,300 mailing lists. A search for ‘ukoln’ provided me with not only various posts containing this string, but also details of the person who made the post, the lists posted to and also, as shown, a graph of the numbers of posts over time.

Markmail Service

Initially I felt that the graph supported my view that email is dying, but a search for a more general term, “web”, showed me that this was clearly an inappropriate conclusion to make based on this evidence.

But perhaps of more relevance is the main point that Matt made in his tweet:

just discovered http://markmail.org/ would be cool if jiscmail lists were searchable here as well..

Yes, it would be great if JISCMail exposed its mail archives to third party indexing services such as MarkMail. But to do that (or rather, to do that effectively) would require the JISCMail mail archives to provide ‘cool’ application-independent and persistent URIs (which they don’t currently do) and to allow robot software to access the resources. Doing this will, of course, require the service to commit resources to development work and to make changes in policies. A popular and large scale service, such as JISCMail, would only be in a position to do this if they could see tangible benefits to their user communities. I hope the example of the MarkMail service illustrates the potential benefits of opening up one’s data to third party services. I have to admit that I find the JISCMail search interface so poor that I seldom use it. Exposing the data to other services (whether MarkMail, Google or whatever) would enhance access to data available in the JISCMail Web archives, without JISCMail having to wait for the underlying Listserv software to conform with fundamental Web architectural principles.
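To illustrate what an application-independent URI might look like, here is a sketch. The path scheme and the list name are entirely hypothetical; JISCMail provides no such scheme today, which is precisely the point. The properties that matter are that every component is guessable, bookmarkable and robot-friendly, with nothing tied to a session or a particular software platform:

```python
def persistent_message_uri(base, list_name, year, month, msg_no):
    """Build an application-independent URI for an archived mail message.
    The /lists/<name>/<yyyy-mm>/<n> scheme is purely illustrative."""
    return "%s/lists/%s/%04d-%02d/%d" % (base, list_name.lower(), year, month, msg_no)

uri = persistent_message_uri("https://www.jiscmail.ac.uk", "WEB-SUPPORT", 2008, 8, 42)
print(uri)  # https://www.jiscmail.ac.uk/lists/web-support/2008-08/42
```

A URI like this would survive a change of list software, which is exactly the persistence that indexing services need.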

Posted in General | Tagged: | 5 Comments »

Fahrenheit 451

Posted by Brian Kelly (UK Web Focus) on 15 August 2008

I recently attended the JISC’s Innovation Forum. One of the most interesting of the plenary talks was given by HEFCE’s John Selby. In his talk John praised the work of the JISC and the JISC Services, but went on to warn of troubled financial times ahead for the educational sector. The glory days of the past 10 years are over, he predicted.

This was probably not unexpected. What did surprise me, however, were the figures John quoted, which put the carbon cost of IT to the environment on a par with that of flying – both at 2%.

This generated much debate at the forum and, later on, at the conference meal and in the bar. Although people questioned the accuracy of these figures, and wanted to know how they were obtained, there was an awareness that the carbon cost of IT is an issue which the IT sector needs to address. I should add that I subsequently came across details of a forthcoming Government Goes Green conference in which Malcolm Wicks, Energy Minister, BERR, was quoted as saying that

ICT is now responsible for around 2% of global CO2 emissions. The public sector, with annual IT spending of £14bn, has an important role to play in reducing this two percent. An increased focus on sustainable procurement and efficient use of IT products are two key areas that it needs to work on and I am very pleased to see a conference dedicated on this.

At the JISC Innovation Forum dinner I found myself sitting next to colleagues from the Digital Curation Centre (DCC). I suggested, partly in jest, that although there was a clear need for continued development of networked services which are popular with users, we had to ask ourselves whether the costs of preserving digital resources could be justified. If, as we learnt from Alison Wildish’s recent presentation at the first JISC PoWR workshop, those involved in Web development activities tend to focus on the pressing needs of their user communities and find it difficult to justify diverting scarce resources to preserving resources which are no longer of significant interest to the institution, why don’t we stop pushing the notion of digital preservation? Not only would this allow the development community to focus their efforts on responding to pressing user needs, but removing archived files from hard disk drives could result in significant savings in energy.

This approach would then both help the users and help save the planet :-)

As I’ve said, this was intended as a joke over our conference meal. But we realised that there may be benefits for the digital preservation community in making such suggestions. After all, preservation is widely considered worthy but dull. If digital preservation was regarded as something radical, might it have a greater appeal to developers? Could those involved in digital preservation work – harvesting old Web sites and even implementing OAIS models – find themselves repositioned as members of an underground radical movement, secretly preserving digital artefacts for a society which regards such activities as unacceptable? Fahrenheit 451 for the 21st century, perhaps.

[Image: Save a Polar Bear campaign poster]

The following day when I suggested this, I was told that there have been discussions about strategies for digital preservation which acknowledge that there are environmental factors which need to be addressed. It seems that there have been proposals that such preservation activities should be based in places such as Greenland and Alaska, where the low temperatures may reduce the need for consuming energy to keep the disk drives running at acceptable temperatures.

Now scientists may point out that running large scale server farms in locations near glaciers and the ice cap may increase the rate at which they melt. But the ideas which were bounced around at the event did make me wonder whether centralisation of networked services (e.g. running applications hosted by Google or Yahoo or running our applications on Amazon’s S3 and EC2 servers) would be more beneficial to the environment than all of our institutions running our own local servers.

And perhaps such discussion might be useful in a teaching context. Does data curation, for example, conflict with environmental protection? If so, should we forget it? Or could this approach result in deletion of the very data that could save the planet?

What do you think?

And if you’d like to take part in a viral marketing campaign which seeks to make digital preservation interesting by suggesting that it might be responsible for global warming, feel free to make use of the poster which has been produced. And note that a Creative Commons Zero licence (currently in beta) has been assigned to this resource, so you don’t need to cite the original source. Let’s be part of an underground movement :-)

Posted in Finances, preservation | 19 Comments »

Usage Statistics for the IWMW 2008 Live Video Stream

Posted by Brian Kelly (UK Web Focus) on 14 August 2008

The first live streaming of talks at an IWMW event took place at IWMW 2006, when we experimented with an in-house streaming service and use of the Access Grid. The following year live streaming of the plenary talks was provided by staff at the University of York, and recordings of most of the talks were subsequently made available on Google Video.

On both occasions the number of people watching the live video stream was low, with fewer than 20 viewers at each of the events. Despite the low numbers we felt the service was valuable, as it provided us with an opportunity to gain experience not only of various streaming technologies but also, and more importantly, of the non-technical aspects of live streaming at events, such as privacy, copyright, accessibility, etc.

This year’s IWMW 2008 event was held in the King’s Conference Centre at the University of Aberdeen.  I was not the only delegate who was impressed by the King’s Auditorium – as one person commented on the event evaluation form “Conference hall had great facilities and microphones meant that you could hear delegates questions“.

The venue also had excellent AV facilities, and we were pleased that, once again, we were able to stream the plenary talks. The quality of the video was excellent, as you can see if you watch any of the videos of the talks.

But perhaps the most noteworthy aspect of the live streaming was the number of people watching the talks. As can be seen from the accompanying diagram, there were 160 people watching the videos on the final day of the event. As IWMW 2008 attracted 180 participants, a number of whom had to leave before the event finished, I suspect we can say that there were more remote people watching Ewan McIntosh’s closing plenary talk on “Unleashing the Tribe” than there were in the King’s Auditorium. When I mentioned this to my director, Liz Lyon, she wondered whether we will soon reach a ‘tipping point’ at which live streaming of talks at large conferences in the digital library environment will be expected as a mainstream offering.

For that to happen, though, there will be a need to establish the business case for providing the streaming service, ensure that it is easy to use and ensure that the risks are being addressed.

The business case is interesting. Who should pay for the costs of providing a video streaming service for an event? Should the costs be taken from the participants who attend the event? Or should remote viewers who wish to access the video stream have to pay? Or perhaps event organisers should be looking for commercial sponsorship to cover the costs (although in light of the current economic turbulence, now is probably not a good time to suggest this). I wonder, though, whether the costs could be covered by the host institution. Once the AV equipment has been installed, can the support costs be included in the rental of the facilities – just as we are now starting to expect access to a WiFi network to be provided as standard?

Once the business case has been sorted, there will be a need to ensure that the service is easy to use (back at IWMW 2006 people wishing to view the streaming video service needed to install “Real Player and the Xiph Player Plugin or Windows Media Player with the illiminable Ogg Directshow Filters for Speex, Vorbis, Theora and FLAC, with Linux users needing MPlayer with Ogg Theora“). Nowadays users shouldn’t need to concern themselves with details of the technologies, as use of Flash seems to provide the interface to streaming services (although there may be issues about versions of Flash). However I suspect there will be a need to provide a back channel, to enable the remote participants to discuss the talks. There will also be a need for the remote participants to join in discussions with the local audience, especially if a WiFi network is available. There will be a need, therefore, to ensure that the back channel is not tightly coupled to the video streaming service.

Finally there will be a need to address the risks. This will include addressing issues such as privacy, copyright and data protection. In addition there will be a need to consider the quality of service and reliability of the streaming service, especially if the costs in providing the service have been made transparent.

And the more I think about such issues the more I wonder whether live streaming at conferences has reached a tipping point. Might it simply be too much effort to provide on a regular basis?

Posted in iwmw2008 | 5 Comments »

Revisiting Development Of Facebook Applications

Posted by Brian Kelly (UK Web Focus) on 13 August 2008

I recently commented that I was pleased to see that the JISC-funded EDINA service was engaging with a number of externally-hosted Web 2.0 services in order to “improve engagement with their user communities”. In my post I made an observation on the release of a Facebook application (one which provides access to the Suncat service). I was pleased to see that EDINA are willing to explore the potential of Facebook for providing a platform for accessing their service – in some circles Facebook is regarded as unacceptable, perhaps because of concerns over data lock-in and privacy, but also on what might be regarded as ‘ideological grounds’. My view is that if such applications can deliver useful services to users in a cost-effective manner, then that will probably be acceptable.

In response to my post Nicola Osborne, a developer at EDINA, commented:

If anyone has comments on the search app or features that should be added we’d be very keen to hear them as the gradual migration over to the new version of Facebook seems like a good time to reassess how our app is working and could be improved and expanded (it’s very basic at the moment).

Nicola’s comment is very timely as I think there is a need for a debate on exactly what it is we (developers and users) might expect from the development of such Facebook applications. We will also need to consider the resource implications in developing such applications and the longer term maintenance and support costs. 

The Facebook page for the Suncat application is shown below. It should be noted that as well as the search interface itself (shown at the bottom of the image), the page also provides information about the service, allows users to become ‘fans’ of the application, provides a ‘minifeed’ of information about the application and has a ‘wall’ which provides a forum for user comments. What this would seem to provide is an open environment for discussions about an application, and a mechanism for potentially making contact with fans of the application.

If we look at the Copac Facebook application page developed by the JISC-funded MIMAS service we can see a related approach. Here we can see how the application can be added to (embedded within) other Facebook pages. I can also see my Facebook friends who have added this application. And as, in this case, the people shown are people whose views on digital library applications I trust this can potentially help me in deciding whether to install the application. And if, for example, my Facebook page is updated with a message saying that 50 of my friends have installed the Copac or Suncat application I’m likely to wonder what I’m missing. And if I install the application this may influence my Facebook friends. So the viral marketing aspect has the potential to enhance usage of a service which is made available in Facebook.

But if you actually use either of these applications you will find that the experience is rather disappointing. Once you’ve entered a search term and pressed submit you leave the Facebook environment and are taken to the Suncat or Copac service. You do not have the seamless environment within Facebook you might expect. And your use of the service does not have any ‘social’ context – if you have installed the application you are not informed of the number of your friends who have searched for a particular item. And you might be relieved at this, as you may not want your friends to see what you have been searching for. But if this is the case – if searching isn’t actually a social activity – what then is the point of providing the service within a social networking environment such as Facebook?

The answer to this question may be that the marketing aspects that social networks can provide are regarded as beneficial to the organisation developing the service. And as we have seen with popular applications such as Firefox, large numbers of users are sometimes willing to associate themselves with an application (and I’ve just noticed that the Twitter application page in Facebook has 10,106 fans). So perhaps a decision to develop a Facebook application would be one made by the marketing group for a service. Or perhaps there is an expectation that a thriving support service can be developed within popular social networking environments, in which case the decision would be made by those involved in providing the support infrastructure for a service.

But perhaps, based on the experiences I’ve had, we shouldn’t expect too much in terms of the functionality which a Facebook application can provide. Is this a limitation of Facebook as a platform, or is it simply that, as Nicola has said about the Suncat application, the service is still very basic at present and EDINA are still exploring how the application might be developed? Or might Facebook applications have a useful role to play, but only in certain application areas? Earlier this year Seb Chan described on his blog the Artshare Facebook application, developed by the Brooklyn Museum (one of the pioneers in a number of uses of Web 2.0 services). As Seb described:

This allows you to add selected objects from museum collections to your Facebook profile. These object images then link to your museum’s collection records, the idea being that people can effectively ‘friend’ objects in your collection, promote them for you on their profiles, and drive traffic back to your website.

Are the benefits, then, in providing access to objects which can, in some way, drive traffic back to your service? Or could Facebook provide an environment for games which provide educational benefits (Scrabulous for remedial English teaching, perhaps?) But are there any significant benefits to be gained, apart from the marketing aspects, from providing a search interface to services from within Facebook?

Posted in Facebook, Web2.0 | Tagged: , | 12 Comments »

EDINA And Web 2.0

Posted by Brian Kelly (UK Web Focus) on 11 August 2008

I was recently reading the EDINA Newsletter. EDINA, a JISC-funded national datacentre based at the University of Edinburgh, has announced its strategic plan for 2008-2011 (PDF) and amongst its priorities are “improving engagement with our user communities” and “appropriate use of Web 2.0 social media and collaboration tools“.

It seems that EDINA has already started implementing these plans, as the newsletter also describes the EDINA Digimap blog, which has been launched as a way of “exploring alternatives to email for distributing information about the service“. It is interesting to note that the blog is hosted on Blogspot. This strikes me as sensible – rather than having to find technical expertise in-house to install and maintain blog software, EDINA are using a well-established and mature externally-hosted service. It was also interesting to note that they are using Blogspot rather than WordPress. I suspect that, after lagging behind a few years ago, Blogspot may have caught up with WordPress in its functionality and ease-of-use.

The newsletter also mentioned that the Suncat service (the Serials Union Catalogue for the UK research community) now has a “search application that anyone on Facebook can easily add to their profile, enabling them to search for journals held in over 60 UK research libraries” – and if you have a Facebook account you may wish to try the application.

Externally-hosted blogs and Facebook applications – it does seem that EDINA is embracing Web 2.0. And reading the strategic plan for 2008-2011 (PDF format) it seems this decision was made in order to enhance accessibility of its services.  The plan describes how “EDINA recognises the growing user-base arising from delivery of service to a widening client community and integration with other environments, especially those using mobile technologies. In addition, the growth in popularity of Web 2.0 social media and collaboration tools is important for the support of learning and research activity.” I was also pleased to read that although EDINA is committed to improving the utility and usability of its services for “the full range of its users, including those with disabilities” EDINA has acknowledged that

adopting too conservative an approach risks disenfranchising many users and therefore EDINA will evaluate how its services can be presented and personalised to address changing information-seeking and user practices, including access through devices other than computer screens, such as PDAs and mobile phones.

It is good to see a national JISC service such as EDINA embracing Web 2.0 and making a commitment to enhancing the accessibility of its services by providing personalised services and supporting a variety of devices (and it is noticeable that no reference is made in the plan to achieving such accessibility by simply mandating WAI compliance).

Posted in Accessibility | Tagged: | 2 Comments »

Citizen 2.0, Strike 2.0, David Cameron 2.0 and Coldplay 2.0

Posted by Brian Kelly (UK Web Focus) on 8 August 2008

Last week’s New Statesman magazine (4th August 2008) had a special supplement entitled “Citizen 2.0″. As described in a blog post by Aleks Krotoski, Technology Correspondent of the Guardian and chair of the event, this was a summary of a roundtable discussion on “Privacy, security and civil liberties in a digital society”.

The main article in the Work supplement of Saturday’s Guardian (5th August 2008) was entitled “Strike 2.0” and described how strike actions in the 21st century are beginning to make use of social networking services.

The Guardian also published a leader column on 16th July 2008 which was entitled “David Cameron 2.0“.

And a review of Coldplay’s “Viva la Vida or Death and All His Friends” album published in The Observer on 8 June 2008 described how “After three best-selling works which the piano-rock four-piece now consider a trilogy concluded, Coldplay declared themselves ready for Coldplay 2.0“.

The 2.0 meme is now established in mainstream journalism, it seems – well, perhaps only left-of-centre publications, although I haven’t read the Telegraph or the Mail for some time :-).

I wonder if the style guides for these publications have been updated to define how this term should be used? I am comfortable with use of the term in this way, just as I am when I hear terms such as ‘library 2.0‘, ‘e-learning 2.0‘, ‘research 2.0‘, ‘enterprise 2.0‘ and ‘government 2.0‘. And I am pleased that the Web industry has had an impact on the language which now seems to be becoming accepted within the mainstream media.

An earlier attempt by the Web community to describe a new generation of technologies was the suffix NG, which was used, for example, to describe HTTP-NG. I have to admit that I’m pleased that this term, coined by fans of Star Trek, failed to take off.

In the political sphere we have seen the term ‘New’ being used to describe the different approach which was taken by the Labour party in the mid 1990s. We subsequently saw the terms ‘modern’ and ‘moderniser’ being used to describe the response made by the Conservative party. Now although I suspect many readers won’t describe themselves as fans of ‘New Labour’ or the modernised Conservative party, it should be acknowledged that these terms were widely used and understood, even if they did not have a rigorous definition.

And for me it’s just the same with Web 2.0, e-learning 2.0, Library 2.0, etc. Let’s get over debates about these broad terms and instead discuss the issues.

Posted in Web2.0 | 3 Comments »

IWMW 2008 Bar Camps

Posted by Brian Kelly (UK Web Focus) on 7 August 2008

The main change to the IWMW 2008 timetable this year was the introduction of a barcamp session.  As described on the IWMW 2008 Web site:

Wikipedia defines BarCamp as an international network of user generated conferences, open, participatory workshop events, whose content is provided by participants. A BarCamp is typically one or two full days held at a weekend attended by people with an interest in technology. The day is split into a number of sessions typically of around 30 minutes each. Depending on the number of participants, size of venue, etc. there may be several sessions running simultaneously.

For the IWMW 2008 event we still had the conventional plenary talks and parallel sessions which had been planned in advance. But in addition:

A board [was] provided at IWMW 2008 for people to post up ideas for slots, rooms will then be allocated. Screen projectors will be available in rooms for people to use. During the 45 minute allocated slot there will be time for up to 18 sessions and each session will be 20 minutes long.

This innovation was introduced by my colleague and IWMW 2008 co-chair Marieke Guy, with suggestions from Michael Nolan, Edge Hill University, who shared his experiences of barcamps: “One of the best presentations I’ve seen was titled “stuff I know” and was a guy drawing shapes, arrows and random words on a flip chart while telling us what we should know…“.

And having just had my first glance at the IWMW 2008 feedback forms it seems that the Barcamp idea was a great success.

The Overall views for the event included the comments “Bar camp was an excellent idea that should be utilised more in the future” and “Bit disappointed by the main session but the parallel/barcamp sessions were much better“.

Comments on the Most Valuable Aspects of the Event included “Barcamp and discussion with others and seeing how successfully people have implemented successful change over the last year“, “Barcamp sessions“, “Barcamp” and “Barcamp” :-)

We were also keen to get feedback on Aspects Which Could Be Improved. Even the responses to this question were all positive about the barcamps: “Bar camps a bit rushed. The session were not too long but changeover times took too much out of 20 mins, More barcamp stuff please-lets build stuff!“, “Barcamps not long enough” and “Not enough time left between barcamp sessions to get from one room to the next“.

The Barcamp Topics

The barcamps were clearly a success. But what topics were covered? A list of the topics is provided on the IWMW 2008 Web site and is also given below. And note that a page has been created on the IWMW 2008 Ning social network which will enable the barcamp facilitators (and, indeed, the participants) to provide a summary of the session, notes on the discussions and links to relevant resources.

Session1: Wednesday 23rd July 2008 from 14.15-14.35

  1. Sex, Lies and Microsites [see Ning page]
  2. So What Is A Good Open Source CMS?  [see Ning page]
  3. Stuff You Need To Know About iTunesU [see Ning page]
  4. How Can A WCMS Save £3.4 Million In 12 Months? [see Ning page]
  5. Tenish 5-Minute Ways To Improve Your Website [see Ning page]
  6. Web Analytics Guiding Web Development [see Ning page]
  7. Web 2.0 In Student Activism: What We Can Learn From Anonymous [see Ning page]
  8. How Qualified Do You Have To Be To Manage A Website? [see Ning page]

Session 2: Wednesday 23rd July 2008 from 14.40-15.00

  1. Canadian View On Life, Dearth and Social Software [see Ning page]
  2. DIY CMS – Building A Low Budget System, Getting People To ‘Buy-In’ [see Ning page]
  3. Immediacy WCMS In Action [see Ning page]
  4. T4 CMS / Sitestat / Redesign / Rambling Q&A / Discussion [see Ning page]
  5. Barriers To Making Things Work On Second Life [see Ning page]
  6. Simple Scriptaculuous [see Ning page]
  7. Forum: Feedback on Nedstat [see Ning page]
  8. Migrating Into A CMS – What Is Your Experience? [see Ning page]
  9. Live@EDU [see Ning page]

Of course, as the barcamps were fairly informal and may have been provided on an ad hoc basis, there is no requirement for the facilitators to provide such resources. But I think it is useful to have a record of the sessions which were held, and to provide an opportunity for those who may wish to write a summary of a session to do so, without myself or Marieke acting as a bottleneck to the creation of such resources.

Posted in iwmw2008 | Tagged: | 3 Comments »

The ‘Chat’ Infrastructure At IWMW 2008

Posted by Brian Kelly (UK Web Focus) on 5 August 2008

The First Use Of Realtime Chat At An IWMW Event

The IWMW 2005 event held at the University of Manchester on 6-8th July 2005 was the first time that a WiFi network was used at UKOLN’s IWMW annual event. I had attended the EUNIS 2005 conference a few weeks prior to this and presented a paper on Using Networked Technologies To Support Conferences. This paper described the potential benefits which networked applications could provide to what Lorcan Dempsey subsequently described as Amplified Conferences. As described in that paper, we ensured that we described the technologies which would be available at the IWMW 2005 event and provided an AUP (Acceptable Use Policy) covering use of the technologies.

I think there were fewer than 20 participants who made use of the event ‘chat’ infrastructure, which was provided by IRC (Internet Relay Chat), and those taking part were mainly Web managers who had a very technical focus, as can be seen from the IRC archives. The nature of the discussions changed, however, on the second day of the event, 7th July 2005 – or, as it became known, 7/7 – a date that (fortunately) is not as globally significant as 9/11 but, especially for those with London connections, a date which will be associated with the London Bombings.

It was a very surreal experience following a message on the IRC channel about what was initially reported as a train crash on the London Underground, and the subsequent discussion.

Jul 07 10:08:02 <Tim>explosion on london underground. entire network closed!!
Jul 07 10:09:04 <–DavidBailey has quit (Quit: CGI:IRC (EOF))
Jul 07 10:10:06 <JeremySpellerUCL>explosion where?
Jul 07 10:10:15 <Tim>liverpool street
Jul 07 10:10:35 <JeremySpellerUCL>Grief
Jul 07 10:10:40 <Tim>metropolitan line, two trains collided, several wounded
Jul 07 10:10:58 <Stuart_Steele_Aston>Tthe bbc site is grinding?
Jul 07 10:11:02 <JMHarmer>bbc news site not responding – u saw the news report? prrsumably everyone else is trying to now.

The launch of a WiFi-enabled IWMW event will be one that will be remembered for a long time by those who took part in the discussions on that day.

The ‘Back Channel’ At IWMW 2008

Moving forward to IWMW 2008 we knew that many of the participants would expect a real time communications infrastructure to be provided, as this has been the norm at IWMW and many other UKOLN events since 2005. And as we were video streaming the plenary talks we expected to have remote participants joining in the discussions, too.

Over time the terms used to refer to this technology have evolved. Use of the term ‘chat’ has decreased, in part due to its derogatory connotations but also due to a move away from IRC to more native Web-based communications technologies. I have heard the term ‘back channel’ being used, and this term works if (as was the case with Ewan McIntosh, the final plenary speaker at IWMW 2008) it is used to provide realtime feedback to a speaker. But more commonly the realtime communications technology is used by the audience (both those physically present, those watching a video stream and also, in some cases, those who may only have access to an audio stream or are viewing the PowerPoint slides). The term ‘micro blog’ has also been used (indeed this is how I described the service on the IWMW 2008 Web site), but that suggests an official commentary on an event, rather than the discussion forum which was how the service was actually used. I don’t think there is yet a widely agreed term to describe this, so for now I’ll use the term ‘back channel’.

Since IWMW 2007 Twitter has become very popular in certain circles, and most IWMW 2008 participants will have heard of it, even if they weren’t Twitter users. However we decided not to suggest use of Twitter as the event back channel as, when I’ve tried this previously, I’ve found it is too intrusive for those who follow me on Twitter who aren’t at the event or aren’t interested in it.

There was a need for a tool, I felt, similar to Twitter but less intrusive. I had some experience of Coveritlive (at events such as the eFoundations Symposia – although I haven’t been able to find the archive of the discussions). However I found a number of niggles with that software, including the need to (normally) approve comments. In response to a tweet asking for alternative suggestions I decided to make use of ScribbleLive.

This did have some advantages, but also some weaknesses. As Andy Powell commented on the eFoundations blog:

My feeling is that ScribbleLive makes better use of screen real-estate.  On the other hand, Coveritlive has better bells and whistles and more facilities around moderation (which can be good or bad depending on what you want to do).  In particular (and somewhat surprisingly), Coveritlive handles embedded URLs much better than ScribbleLive.  Overall, my preference is slightly twoards Coveritlive – though I could be swayed either way.

In response to Andy’s post Matt Jukes and Phil Wilson suggested that neither tool was ideal for the job. I would agree with this – I think we will see much development in this area, not only in enhancing the usability of the tools but also in allowing the data to be more easily integrated with other tools. I would like, for example, to have tools to allow me to export the data to other environments (I have migrated the content to the IWMW 2008 Web site, but I had to do that manually). It would also be useful to be able to link comments with a particular presenter’s slides or the video – without the discussion having to be tightly coupled with the multimedia experience (as seems to be the case with, for example, the Elluminate service).

Another comment Andy made was on “the importance of having someone in the venue dedicated to supporting remote participants“. Again I would agree with this. This was an area I had responsibility for – but I found that I was not able to do this at the start of the second afternoon due to difficulties in connecting to the WiFi network. I also found myself failing to support the remote participants during Ewan McIntosh’s talk because I found it so interesting! But if we do need dedicated support for remote participants there will clearly be a cost in providing this support. Does this mean we should start to charge remote participants, I wonder?

Posted in iwmw2008 | 3 Comments »

Why Don’t Members of Institutional Web Teams Blog More?

Posted by Brian Kelly (UK Web Focus) on 4 August 2008

On the second day of the IWMW 2008 event Michael Nolan made the comment “If people are saying we need to communicate what we’re doing better, why do so few Web Services depts have a blog?” on the event’s live blog.

Shortly after getting back from the event Michael, a Web developer at Edge Hill University, sent a message to the website-info-mgt JISCMail list in which he raised this issue with a wider audience:

At the risk of opening myself up to (probably deserved) flaming and accusations of blatant self promotion, I’ve posted to the Edge Hill Web Services blog questioning why so few other university web teams have a blog:

http://blogs.edgehill.ac.uk/webservices/2008/07/28/blogging-web-teams/

Comments and feedback welcome!

This led to a discussion on the list – and also responses to Michael’s blog post on the Edge Hill University Web Service’s team blog.

On the mailing list various reasons were suggested for the lack of blogs by members of Web teams:

  • I find it hard keeping up with blogging – reading and writing, [because] I’m too damn busy with other projects.
  • … our workload is so great that this sort of activity tends to sink to the bottom of the list.
  • Does anyone think that such blogs would add any value over and above resources such as this list? … So, to turn the question on its head, who thinks that they could benefit from reading another web team’s blog?
  • If every one of us blogged about our work, it would be very hard to sort out the chaff.

Others replied arguing for the benefits of blogging, suggesting the benefits of the ‘long tail’ (an obscure blog post on the intricacies of XSLT coding is likely to be of interest to perhaps a small number of others) and how use of filtering tools should help such nuggets to be found by interested parties. Janet McNight at the University of Oxford also suggested that:

I think there’s a feeling that a ‘blog’ has to involve sustained pieces of writing, well-crafted prose, etc; when really all it needs to be is “I was wrestling with [some problem] and found [some neat solution]: [lines of code, config, whatever]” — or “we’ve been looking into [some new technology] and these are a few of the thoughts we’ve had so far”.

I would very much agree with Janet’s comment. I feel there is a need to regard a blog as a communication medium rather than a publication medium. After all, many members of Web teams who may be reluctant to blog are willing to make use of email lists for advice on often obscure problems – and, ironically, mailing lists tend not to have the richer structured content and software tools which can help people to filter out content which is of no interest and find the material which is.
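That richer structured content is what makes automated filtering possible: because a blog exposes an RSS feed, even a low-traffic team blog can be skimmed by machine. As a minimal sketch (the feed content, URLs and keywords here are purely illustrative, not any real team's blog), the following Python snippet pulls the items out of an RSS 2.0 feed and keeps only those whose titles match a topic of interest:

```python
import xml.etree.ElementTree as ET

# A hypothetical RSS 2.0 feed, standing in for a Web team's blog feed.
RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Hypothetical Web Team Blog</title>
  <item><title>Wrestling with XSLT recursion</title>
        <link>http://example.ac.uk/blog/xslt</link></item>
  <item><title>New coffee machine in the office</title>
        <link>http://example.ac.uk/blog/coffee</link></item>
</channel></rss>"""

def filter_items(rss_xml, keywords):
    """Return (title, link) pairs whose title mentions any keyword."""
    root = ET.fromstring(rss_xml)
    matches = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        if any(k.lower() in title.lower() for k in keywords):
            matches.append((title, link))
    return matches

# Keep only the posts about the technologies we care about.
print(filter_items(RSS_SAMPLE, ["xslt", "cms"]))
# → [('Wrestling with XSLT recursion', 'http://example.ac.uk/blog/xslt')]
```

In practice an aggregator would fetch many such feeds and apply the same filter, which is precisely the sorting of wheat from chaff that a mailing list archive makes difficult.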

The comments on Michael Nolan’s blog were, perhaps unsurprisingly, somewhat critical of the failures of institutional Web teams to embrace blogging (Michael has found only 4-5 examples of such blogs). Matt Machell, for example, commented that:

it often surprises me how insular the HE web development world is. It seems to talk to itself, but not to the wider web professional community

Alison Wildish responded on both the Edge Hill blog and the website-info-mgt mailing list with some considered views on the matter. She identified some of the barriers to blogging (note that I link to her comments on the blog as this is easier to read, more navigable and has more easily cited URIs than the JISCMail archive) but she still felt that “there aren’t enough of us [blogging] for people to see the real value – yet! If more of us used blogs then we’d be able to gain a real picture of the work going on across all Universities“. Alison went on to list the benefits the University of Bath Web Services blog is providing.

But although I would agree with Alison’s views I think there are dangers in forcing people or teams to blog (I should hasten to add that I’m not suggesting that Alison is saying this). I still feel there is a need to discuss the benefits and to gain a better understanding of best practices – and the associated dangers. And I did wonder whether, as many members of institutional Web teams are happy to contribute to mailing lists, an email blog service such as Posterous might provide a lightweight approach to blogging – with this service you simply send an email to create a blog post, which, of course, has the ‘cool URIs’ and usable RSS feeds which JISCMail lists fail to provide.
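The appeal of post-by-email is that the workflow is identical to what Web teams already do on mailing lists: the email’s subject becomes the post title and its body the post content. A rough sketch of that mapping, using Python’s standard email library (the posting address shown is hypothetical – services like Posterous assigned each user their own – and actually sending the message via SMTP is omitted):

```python
# Sketch of the post-by-email idea: the email's subject becomes the blog
# post title and the body its content. The posting address below is
# hypothetical; sending via smtplib is left out for brevity.
from email.message import EmailMessage

def draft_blog_post(title: str, body: str, post_address: str) -> EmailMessage:
    msg = EmailMessage()
    msg["To"] = post_address       # the service's post-by-email address
    msg["Subject"] = title         # becomes the post title
    msg.set_content(body)          # becomes the post body
    return msg

msg = draft_blog_post(
    "Why I blog",
    "Some brief notes on blogging as a communication medium.",
    "post@example.posterous.com",  # hypothetical address
)
print(msg["Subject"])
```

Nothing new for the author to learn, in other words – yet the service publishes the result with a stable URI and an RSS feed.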

But if an email blog tool is still too heavyweight, perhaps another approach might be microblogging. We are, after all, seeing Twitter being used conversationally to discuss the pros and cons of team blogging, with the advantage that posts have to be kept within the limit of 140 characters – in this case, as partly illustrated, Michael Nolan raised the issue on Twitter initially, Paul Walk suggested some of the possible difficulties, Mike Ellis, with tongue in cheek, questioned whether Web managers had anything to say and Michael Nolan delivered the punch line :-)

In the screen shot shown above there are six tweets (~ 6*140 bytes) from three twitterers discussing the issue: there are only 5 active blogs, reasons why this may be, a challenge to those reasons and a witty riposte. Short and sweet :-)

But more seriously I think there are roles for a diversity of communication tools, including email lists, blogs and micro-blogging tools: each will have its own strengths and weaknesses, but we need to experiment and gain experience in order to find out what those strengths may be. And to revisit Michael’s original reflection on the need for members of Web teams “to communicate what we’re doing better”, can it really be suggested that email lists are sufficient?

Posted in Blog | 4 Comments »