What I Noticed For The First Time In The Past 24 Hours
Posted by Brian Kelly on 26 February 2014
Techniques for Predicting Future Trends and Their Implications
Back in October Tony Hirst and I co-facilitated a day-long workshop on Future Technologies and Their Applications. Mechanisms for predicting future developments, and for being receptive to the possibilities and implications of technological and societal change, have been a long-standing area of interest to me.
Back in 2007, in a post entitled The History Of The Web Backwards, I was inspired by the “History of the World Backwards” comedy series on Radio 4 to describe the demise of the web, from the date of the blog post to its extinction in the early 1990s. The aim of that approach was to provide different insights into technological developments. Two years later, in a post on Forecasting Trends Backwards, I described a YouTube video entitled Romancing Your Soul Absolutely Brilliant! which provided another take on time travel: it began with a young woman’s dismal view of the implications of technological developments, concluding “And all of this will come true unless we choose to reverse it“. The talk was then played in reverse to provide an optimistic view of developments. If you’ve not seen it before I’d recommend spending 1 minute 44 seconds watching it (there have been over 201,000 views since the video was uploaded in October 2009).
In our Future Technologies workshop Tony Hirst introduced me to a new technique for helping to spot technological developments and reflect on their implications. As Tony described in a post which asked “What did you notice for the first time today?” this question “can be important for trend spotting – it may signify that something is becoming mainstream that you hadn’t appreciated before“.
Tony went on to give some examples of how he uses this approach:
I’ve started trying to capture the first time I spot tech in the wild with a photo, such as this one of an Amazon locker in a Co-Op in Cambridge, or a noticing from the first time I saw video screens on the Underground.
In a post in which I gave my thoughts on this technique I posed the question slightly differently: What Have You Noticed Recently? and went on to comment on developments I’d observed in recent months (e.g. badges for gaming activities; evidence of use of mobile devices in bed; WiFi on buses and making payments using a mobile phone).
Providing examples of technological developments you have observed today is more challenging – especially if you noticed the developments at 8pm! This was when I was struck by something I had not come across before, so I’ll keep to the spirit of Tony’s methodology but tweak it by commenting on “What I Noticed For The First Time In The Past 24 Hours“.
What I Noticed For The First Time In The Past 24 Hours
Last night I went to the Odeon Cinema in Bath. The advertisements included one which encouraged viewers to download the Cinime app (available from the iPhone and Android marketplaces). I installed the app on my Galaxy Note phone and started to use it during several further advertisements shown on the screen. Unfortunately, as I had to download the app over a slow 3G network, I wasn’t able to play the games which let you interact with the display on the cinema screen. However I was able to hold my phone up to the screen and receive further information about a trailer which was displayed.
Wondering How It’s Done?
On my way home I speculated on how the app might work. I had stated that I was in an Odeon cinema when I launched the app, so it had some contextual information about me. But did it know which cinema? If not, how would it relate my responses to the quiz displayed on the screen? Perhaps there are only a fixed number of quizzes?
However the quizzes were quite simple. I was more interested in how taking a photo of a trailer shown on the cinema screen would provide information about the film. Was there some clever pattern recognition (there didn’t appear to be any QR code or equivalent visible on the screen)? Or perhaps, I thought, the app might be processing the audio; after all, apps such as Shazam and SoundHound are able to recognise popular music.
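One widely-used family of techniques for matching a phone-camera shot against a set of known images is perceptual hashing. The sketch below shows a minimal average hash (“aHash”) in Python; it is purely illustrative, and the frame names and 8×8 thumbnail size are my own assumptions — there is no suggestion that Cinime actually works this way.

```python
# Average hash ("aHash"): reduce an image to an 8x8 greyscale thumbnail,
# then set each bit according to whether that pixel is brighter than the
# mean.  Visually similar images produce hashes with a small Hamming
# distance, so a noisy camera shot can still be matched to its source.

def ahash(pixels):
    """pixels: 64 greyscale values (an 8x8 thumbnail, row-major order)."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Count the bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def best_match(shot, database):
    """Return the key of the database image whose hash is closest to the shot's."""
    h = ahash(shot)
    return min(database, key=lambda k: hamming(h, ahash(database[k])))
```

Because the hash only depends on relative brightness, a uniformly dimmer or brighter photograph of the same frame hashes identically, which is exactly the robustness a cinema-screen photo needs.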
How Was It Done?
An article in The Next Web gives some hints as to how the app works:
Cinime uses audio watermarking and image recognition technology to enable users to unlock brand and film-related content on their phones. During the interactive quiz, cinemagoers are invited to answer a series of questions displayed on the silver screen, questions that are tailored to different audiences and movies. If they get two or more questions correct, they can redeem a PlayStation-sponsored prize after the movie or during their next visit.
So both audio watermarking and image recognition are used, but more detailed information is not provided. Interestingly, the fact that a Google search for “how does cinime” is automatically expanded to “how does cinime app work” suggests I’m not the first to ask this question.
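The audio watermarking the article mentions can be illustrated with a toy sketch: embed a quiet near-ultrasonic tone in successive frames of the soundtrack to encode bits, then recover them by correlating each frame against the carrier. Every number here (18 kHz carrier, frame size, amplitude, threshold) is an illustrative assumption of mine, not a detail of how Cinime’s watermarking actually works.

```python
import math

RATE = 44_100    # sample rate in Hz
FREQ = 18_000    # near-ultrasonic carrier (hypothetical choice)
FRAME = 2_048    # samples per encoded bit

def embed(audio, bits, amp=0.05):
    """Add a quiet 18 kHz tone to each frame whose bit is 1."""
    out = list(audio)
    for i, bit in enumerate(bits):
        if bit:
            for n in range(FRAME):
                t = (i * FRAME + n) / RATE
                out[i * FRAME + n] += amp * math.sin(2 * math.pi * FREQ * t)
    return out

def detect(audio, nbits):
    """Recover bits by correlating each frame with the carrier tone."""
    bits = []
    for i in range(nbits):
        s = c = 0.0
        for n in range(FRAME):
            t = (i * FRAME + n) / RATE
            x = audio[i * FRAME + n]
            s += x * math.sin(2 * math.pi * FREQ * t)
            c += x * math.cos(2 * math.pi * FREQ * t)
        # Power at the carrier frequency; threshold chosen for this toy setup.
        power = (s * s + c * c) / FRAME
        bits.append(1 if power > 0.1 else 0)
    return bits
```

A real system would have to survive loudspeaker playback, room acoustics and microphone compression, so production watermarks use far more robust coding — but the embed/correlate/threshold structure is the same basic idea.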
A Change in the Culture in Cinemas?
As I held my phone up to the screen and took a picture of it (as illustrated) I felt somewhat self-conscious. Previously, adverts had asked cinemagoers to switch off their mobile devices, highlighting how embarrassing it could be if a phone went off while a film was being shown. But now cinemagoers are being encouraged (indeed bribed, with prizes offered to those who complete the quizzes) to use their mobile phones.
How Could Such Approaches Be Used In Other Contexts?
However, rather than wondering how the app works, or about the implications of the culture change, my main interest was in how such an approach could be used in educational or cultural contexts. That isn’t an issue I’ll address in this post, although I’d welcome suggestions.
A Portfolio of Techniques
This post has been primarily about how the question “What have you noticed for the first time in the past 24 hours?” or “What have you noticed for the first time recently?” can be a useful tool in future-planning workshops.
Last year at the CETIS 2013 conference I took part in a session on the “Future of CETIS” in which Paul Hollins made use of the Delphi process to “identify emerging trends and the future technology landscape in education and predict as a group what technologies will have most impact on the short, medium and longer term in Higher Education in order to prepare institutions for the challenging future which awaits them“.
CETIS has also been involved in the EU-funded TELMap project, which looked at emerging technologies and practices in educational technology, collecting perceptions of timelines, potential impact, feasibility and desirability in respect of developments that are not currently mainstream.
Together with my CETIS colleagues, I will continue to explore ways of engaging with our communities in seeking to predict innovative developments and plan for their implications. The resources used for the workshop on Future Technologies and Their Applications are freely available under a Creative Commons licence. I intend to make further use of the “What have you noticed for the first time recently?” technique in future workshops, alongside the Delphi process which Paul Hollins and I described in a paper on “Reflecting on Yesterday, Understanding Today, Planning for Tomorrow“.
But in addition I’d welcome suggestions on other approaches which can help provide new ways of predicting and planning for innovation. Feel free to leave your suggestions in the comments. I also invite comments on things you may have noticed for the first time recently – with bonus points if you noticed them today! You could even share your observations on Twitter using the #whatInoticed tag.