Over the last few years we’ve seen some jaw-dropping Augmented Reality demos. Here’s a YouTube playlist of some of my favourites:
I love watching these and enjoy imagining myself in a situation where one of these AR apps can save a life, save the world, or just get me to the nearest train station before I have to get a taxi home. I genuinely watch those videos and think, “I could use that. I could be in that very same situation, and when I am, I’d pull out my AR app and the world would once again be a better place.”
I’d love to service my own BMW, and with AR apps like that I don’t feel it’s such a ridiculous idea. However, it’s less likely that I’d surf the streets on my phone trying to find a bar that sold Stella Artois. And therein lies my problem.
My problem with AR experiences is that I don’t want to use them in public. And, as you will no doubt be quick to point out, that’s a pretty major problem. I’d use the BMW app in the privacy of my own garage, but I’m not going to do it in public. As a colleague of mine put it recently, rather more politely than I could ever manage: “it makes you look like a right Charlie”. Have you ever seen someone stop whilst walking down the street and hold their phone in front of them as if they’re looking through it into another dimension? I haven’t, and I walk the streets a lot. Is it because it takes us outside of our comfort zone? Even the swathes of Hoxton Hipsters, who are so quick to jump in and ride the latest technology bandwagon, manage to exercise self-control before taking out their smartphones and browsing the streets as if the device were a window to another world. Maybe it’s a public admission of a gap in our knowledge – who wants to admit they’re actually a tourist? Maybe it’s for security reasons – “The tourist guide told us to be wary of street crime!” Maybe it’s such an obvious interaction with AR that we don’t want to expose our obviousness to others?
Why do we have Augmented Reality?
As I often say, technologists aren’t great at coming up with new ideas of their own; they look at science fiction and think ‘that’s pretty cool, I’m going to build it’. We did it with hoverboards, the Jetsons’ car and Star Trek communicators. So I blame Star Trek. I blame concept cars and their head-up displays. I blame that scene from Star Wars when R2-D2 let slip that Obi-Wan was the only hope. I’ll go back further and blame the Ancient Greeks and the Oracle at Delphi. Anything that removes us in any way from the reality we actually live in can be added to this list.
AR visionary Keiichi Matsuda has released his vision of future AR in video form over the last few years. One example is in the playlist above, but here’s another, which requires you to put on some 3D glasses to fully appreciate the interface that may one day surround us.
Ethics. Who owns your logo? Who owns the space around us?
When you look at an environment through the screen on your mobile phone, is that a real space you’re looking at? What is to stop a vigilante from taking the original image and misrepresenting it through Augmented Reality? This happened to BP, as you can see in the video here. The app lets users see the BP Deepwater Horizon oil spill whenever they see a BP logo: the user simply launches the app and aims their phone’s camera at a BP logo, and on screen the logo is transformed into a broken pipe with oil pluming upwards. I can’t comment on the legalities of this, but it is worrying that it could happen to anyone.
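The mechanics behind an app like this boil down to two steps: find where a known marker (the logo) appears in the camera frame, then composite replacement imagery over that spot. Here’s a toy numpy sketch of that idea – the function names and the brute-force matching loop are my own illustration, not the actual app’s implementation, and a real app would use a proper computer-vision library and handle scale and rotation.

```python
import numpy as np

def find_marker(frame, marker):
    """Locate a known marker in a grayscale frame via brute-force
    normalised cross-correlation; returns (row, col) of the best match.
    Illustrative only -- real AR pipelines use optimised feature matching."""
    fh, fw = frame.shape
    mh, mw = marker.shape
    m = marker - marker.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(fh - mh + 1):
        for c in range(fw - mw + 1):
            patch = frame[r:r + mh, c:c + mw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum() * (m ** 2).sum()) or 1.0
            score = (p * m).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

def overlay(frame, replacement, pos):
    """Composite the replacement image (the 'broken pipe') over the
    frame at the matched position, leaving the original untouched."""
    out = frame.copy()
    r, c = pos
    h, w = replacement.shape
    out[r:r + h, c:c + w] = replacement
    return out
```

Run per camera frame, this is exactly the “point your phone at a logo and see something else” trick – which is also why it’s so hard to police: the substitution happens entirely on the viewer’s device.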
Making AR discreet in our lives
Advances in display technology
At the moment we’re still relying on our phone screens to display the additional information we layer over our environments. Recent advances in display technology should make this a more subtle experience. Glasses with digital displays on their lenses are slowly becoming reality as the technology shrinks to an acceptable size. Contact lenses have long been an alternative to glasses, and we are now seeing useful technology added to them too. We are still a long way from having head-up displays in contact lenses, but many companies are working towards that end goal, which would remove the need for bulky eyewear.
Futurologist Michio Kaku says ‘In the next 20 years, the internet will be in our contact lens. We will simply blink and we will be online, and when I look at you I will see your biography right next to you because my contact lens will know exactly who you are. You will always know who you are talking to and if you speak to me in Chinese, no problem. I will see Chinese subtitles translated into English under your picture.’ Slightly scary stuff. I believe him.
On a more realistic note, Vuzix are making waves with their eyewear (pictured above) as long as you’re comfortable with the price.
Advances in input devices
The Kinect-as-a-walking-aid demo from MIX11 this year is a great example. The irony of this was lost on many people there (I overheard conversations between people complaining about how awfully the visually impaired were being treated). If anyone ever goes to their local Specsavers and is told to gaffer-tape a Kinect unit to their head instead of being prescribed glasses, I would see that as equally a success and a failure. The sartorialist in me would snigger; the functionalist in me would sigh.
Kinect is the next opportunity we have to interface with these alternate realities. We no longer have to hold up a device to browse our world; we can instead interact with it on a more natural level, through voice and gesture. Take the example of the Kinect-powered WorldWide Telescope – this was the first application that I really felt was intuitive and obvious; it was natural to use and navigate. It works. Over the next few months I fully expect to see more shining examples of Kinect-controlled AR as the creative developer community takes advantage of the recently released Kinect SDK for Windows.
Is there an issue of AR standards?
AR experiences on mobile devices currently live outside the browser. There is no standard that browser developers can adopt and build in as part of their software. There are many SDKs that developers can use to create these experiences, and with the release of the Mango software update for Windows Phone 7 it is now possible to create AR applications across all the major mobile operating systems, so these experiences can now be enjoyed by more people on more devices. However, I’d like to see AR implemented as part of the web browsing experience, so I don’t need to jump out to specific applications for information.
I’ll believe it, when I don’t see it
I believe that I walk the streets like a vampire ninja, invisible and casting no shadow. I am unnoticeable and silent in the way I move. I embrace all technology, both useful and useless, but I feel uncomfortable with anything that makes me stand out in a crowd – and currently that’s most forms of AR.
Looking forward to the future, as these sensory devices become less obtrusive in our natural environment, and as we are able to provide input and receive output through discreet technologies, we will start to fulfil the vision of the early pioneers of AR. As our intentional interactions become less obvious to the outside world, these augmentations will become more popular and more prolific. We’ll be able to subtly request and receive information on everything we see, with the added benefit that we can pick and choose to know as much or as little about our current environment as we like.
For now, I’m sorry, but I can’t. I am technology’s proudest user, but right now this just doesn’t work. The idea is genius. The execution, given the technology available to us, is also genius. But until the illusion of AR is truly magical and invisible, I can’t play this AR game.