contextual and ambient apps – on{x} beta

Yesterday saw the release of on{x} for Android phones from Microsoft Israel. If you’ve ever played around with If This Then That (IFTTT), you’ll understand the concept of creating recipes to ‘make the internet work for you’. That is, you set a trigger (like a tweet, a new item in an RSS feed, or a new Instagram photo) to set off an action (such as sending an SMS alert, auto-posting to Facebook and Twitter, or saving your photo to Dropbox). Whilst IFTTT is a nifty little service that is, in essence, a visual interface for mashing up APIs, on{x} takes the concept to the next level.
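To make the recipe idea concrete, here’s a minimal sketch of the trigger → action pattern. Everything in it (the `Trigger`, `Action` and `Recipe` names, the `runRecipe` helper) is illustrative only, not the actual on{x} or IFTTT API.

```typescript
// A minimal sketch of the trigger -> action "recipe" idea.
// All names here are hypothetical, not the real on{x} or IFTTT API.

type Trigger = { name: string; fired: () => boolean };
type Action = (context: string) => void;

interface Recipe {
  trigger: Trigger;
  action: Action;
}

// Example: "when a new RSS item appears, send an SMS alert" (simulated).
const newRssItem: Trigger = {
  name: "new RSS item",
  // A real service would poll or subscribe to the feed;
  // here we just pretend an item has arrived.
  fired: () => true,
};

const sendSmsAlert: Action = (context) =>
  console.log(`SMS alert: ${context}`);

const recipe: Recipe = { trigger: newRssItem, action: sendSmsAlert };

function runRecipe(r: Recipe): void {
  if (r.trigger.fired()) {
    r.action(`triggered by "${r.trigger.name}"`);
  }
}

runRecipe(recipe); // -> SMS alert: triggered by "new RSS item"
```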

The key is in the types of triggers that on{x} offers, including geolocation and movement. By tracking where you are and how fast you’re moving, it can work out whether you’re walking, running or in transit, and act accordingly. Some simple example recipes: ‘text my wife “on my way home” when I leave the office after 5pm’, or ‘start my music app when I’m running’.
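Here’s a rough sketch of how the first of those rules might look. on{x} rules are written in JavaScript against its own device API, which isn’t documented here, so the event shape and the `sendText` helper below are assumptions purely for illustration.

```typescript
// Hypothetical sketch of "text my wife 'on my way home' when I leave
// the office after 5pm". Not the real on{x} API.

interface LocationEvent {
  kind: "enter" | "exit";   // geofence transition
  regionName: string;       // e.g. "office"
  time: Date;
}

// Stand-in for the phone's SMS capability.
function sendText(to: string, message: string): void {
  console.log(`Texting ${to}: ${message}`);
}

function onLocationEvent(event: LocationEvent): void {
  const afterFivePm = event.time.getHours() >= 17;
  if (event.kind === "exit" && event.regionName === "office" && afterFivePm) {
    sendText("wife", "On my way home");
  }
}

// Simulated event: leaving the office geofence at 17:30.
onLocationEvent({
  kind: "exit",
  regionName: "office",
  time: new Date(2012, 5, 5, 17, 30),
});
```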

In my mind, the two key elements at work in this app are context and ambient processing. Contextual apps have been big for a while, with apps like Foursquare continually stepping up their game to offer users relevant, interesting data based on their location, and this year’s SXSW darlings, including Highlight, Glancee and Ban.jo, giving users feedback on the people around them. However, it’s the ambient side of things that got me thinking. Apps that do things you want them to, based on context, without you ever having to open them are, in my eyes, one of the big future areas of app development. It all feeds into the idea of a ubiquitous network of devices that constantly tracks, interacts and feeds back relevant information when you want it to, making the technology that helps you day to day as unobtrusive as possible.

This could be used for simple tasks, like texting people when you’re running late for a meeting or checking into Foursquare when you’ve been somewhere for more than five minutes, or simply for self-tracking, monitoring everything you do so you can analyse it later (this is already starting to happen; check out the ‘quantified self’ movement).
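The auto-check-in example boils down to a dwell-time rule: wait, then confirm you’re still there. The sketch below is a toy version of that idea; the `checkIn` helper and the way arrival is detected are assumptions, not Foursquare’s or on{x}’s real API.

```typescript
// Toy version of "check in when I've been in a place for more than 5 minutes".

const DWELL_MS = 5 * 60 * 1000; // five minutes

function checkIn(venue: string): void {
  console.log(`Checked in at ${venue}`);
}

// Called when the phone detects arrival at a venue; checks in only
// if the user is still there once the dwell period has passed.
function onArrival(venue: string, stillAtVenue: () => boolean): void {
  setTimeout(() => {
    if (stillAtVenue()) {
      checkIn(venue);
    }
  }, DWELL_MS);
}

// Example: arrive at a coffee shop and stay put.
onArrival("coffee shop", () => true);
```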

What do you think? Is this the future of app development?

Published by Luke

Luke is one of Ubelly’s resident social media guys, occasionally switching hats for a bit of design. He is the in-house meme expert, uses Foursquare a little too much and gets hot under the collar when it comes to design, usability and gorgeous code.
