I was looking at a page on the W3C Web site recently to update my knowledge of the SVG specification and SVG tools. I noticed a link at the bottom of the Scalable Vector Graphics (SVG) page to an RSS feed for the page and, as a fan of RSS syndication, thought it might be worth adding this feed to my RSS viewer. However, when I clicked on the link, rather than seeing the RSS feed and having the option to add it to my preferred RSS reader, an error message was displayed:
Now validating this RSS feed with the RSS validator on the W3C Web site informs me of an error with the feed:
It seems that either the W3C’s workflow process has failed to remove the registered trademark character entity from the term “Internet Explorer®”, or the RSS schema has failed to include a declaration for this character entity.
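The underlying problem is easy to reproduce. Plain XML predefines only five named entities (`&lt;`, `&gt;`, `&amp;`, `&apos;`, `&quot;`); `&reg;` is an HTML named entity, so an RSS feed that uses it without a DTD declaration is not well-formed. A minimal sketch with Python’s standard library parser (the feed fragment here is invented for illustration):

```python
import xml.etree.ElementTree as ET

# &reg; is an HTML entity, not one of XML's five predefined
# entities, so a strict XML parser must reject this fragment.
bad_feed = '<title>Internet Explorer&reg;</title>'
try:
    ET.fromstring(bad_feed)
    parsed = True
except ET.ParseError:
    parsed = False  # expat reports an "undefined entity" error

# A numeric character reference (or the literal UTF-8 character)
# is always well-formed XML, with no declaration needed.
good_feed = '<title>Internet Explorer&#174;</title>'
title = ET.fromstring(good_feed).text  # 'Internet Explorer®'
```

Replacing `&reg;` with `&#174;` (or the literal ® character in a UTF-8 document) would have made the feed valid for every conforming parser.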
No big deal, you may think – and, as the page is displayed in the Firefox browser, this is surely another failure of Internet Explorer to follow Web standards.
But if you view the page in Opera you get an XML parser error message:
And here, I think, both Internet Explorer and Opera are simply obeying the XML requirement that user agents must not carry on processing as normal once they hit a fatal parsing error in a non-compliant page.
And this hard-line approach has been promoted by the W3C as a vision of the future of the Web. It has been argued that mandating rigorous compliance with specifications would help to maximise interoperability.
This may be true – but at what cost? As someone who studied engineering at university, I am aware of the benefits of a fail-safe approach to design: if one small component fails, the whole building doesn’t collapse. But in this case one small component – a trademark character entity that hasn’t been properly declared – has led to a total failure to render the page in two browsers.
Don’t we need Web resources to be designed so that they fail gracefully, and are tolerant when humans make mistakes or, as seems to be the case here, when workflows break down?
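Graceful failure needn’t mean abandoning strict parsing altogether. One pragmatic approach is a lenient front end that repairs a known class of mistakes before handing the text to a strict parser. A minimal sketch (the `tolerant_parse` helper is hypothetical, not part of any feed-reader API) that rewrites stray HTML named entities such as `&reg;` into numeric references:

```python
import re
import xml.etree.ElementTree as ET
from html.entities import name2codepoint

def tolerant_parse(xml_text):
    """Hypothetical lenient parse: translate HTML named entities
    into numeric character references, then parse strictly."""
    def fix(match):
        name = match.group(1)
        # Leave XML's own predefined entities untouched.
        if name in ('lt', 'gt', 'amp', 'apos', 'quot'):
            return match.group(0)
        codepoint = name2codepoint.get(name)
        # Rewrite known HTML entities; pass anything else through.
        return '&#%d;' % codepoint if codepoint else match.group(0)
    repaired = re.sub(r'&([A-Za-z][A-Za-z0-9]*);', fix, xml_text)
    return ET.fromstring(repaired)

feed = '<title>Internet Explorer&reg;</title>'
title = tolerant_parse(feed).text
```

The strict parser still gets well-formed XML; the tolerance lives in a separate, explicit repair step, which is roughly what liberal feed readers do in practice.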