Wednesday 8 February 2012

Augmented Reality

Objects and Gestures. How can these be linked? The answer is Augmented Reality.

Rather than using a keyboard and mouse to control a computer, how can we control it in a different way, without holding anything or physically touching anything?

It is all about bringing a part of the physical world to the digital world. As humans, we are not interested in computing; we are interested in information. To get at the information, we have to interface with the computing. So by streamlining this, bringing the two closer together using Augmented Reality, we can break the confines of the mouse and keyboard and move toward more intuitive interface devices.

Pranav Mistry has thought at length about these ideas. His prototype, SixthSense, uses a camera and a projector to create a wearable gesture interface. The concept is to streamline how information is accessed through everyday objects, becoming more connected to the physical world while, in fact, using less "traditional" technology. A page is projected onto a blank wall or surface, and the fingers and hands, tracked by the camera, can be used to interact with it. A simple gesture can open a web browser. A tap gesture can open Google Maps. Pinching with one hand, or even two, can zoom the map in and out. Extended to pictures, users can crop, rotate and align their holiday snaps on any surface, in any country, and email any of them with a straightforward and, more importantly, intuitive swipe of the hand.
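The pinch-to-zoom idea is simple enough to sketch. Assuming the camera tracker hands us two fingertip positions per frame (SixthSense is reported to track coloured markers on the fingers; the function and variable names below are purely illustrative), a zoom factor can be derived from how the distance between the fingertips changes:

```python
import math

def pinch_zoom_factor(prev_points, curr_points):
    """Derive a zoom factor from two tracked fingertip positions.

    prev_points / curr_points: pairs of (x, y) fingertip coordinates,
    as a hypothetical tracker might report them in consecutive frames.
    A factor > 1 means the fingers spread apart (zoom in);
    a factor < 1 means they pinched together (zoom out).
    """
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    prev = spread(prev_points)
    curr = spread(curr_points)
    if prev == 0:
        return 1.0  # avoid dividing by zero when fingers start touching
    return curr / prev

# Fingers move from 5 pixels apart to 10 pixels apart: zoom in 2x.
print(pinch_zoom_factor([(0, 0), (3, 4)], [(0, 0), (6, 8)]))
```

The same ratio could then scale the projected map or photo each frame, which is all "pinch to zoom" really is once the tracking problem is solved.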

Placing a hand in front of the projection to act as a screen, a number pad can be displayed; with the other hand, a phone number can be dialled and the call put through. This removes the need for a mobile phone, an iPad or a laptop. Next to this, those are the technologies of the past. The only drawback is that this technology is still a good few years away from any sort of commercial availability, whereas the electronic devices we all use are available now, and we all have them.


