
Touch technologies are evolving to the point where the environment itself could be used as a ubiquitous content interface, without the need for devices.
Reon Coetzee, Toshiba regional sales manager for southern Africa, says touch is another step in the logical evolution of user interfaces. “Until now, shortcomings in technology have forced us to use buttons, keypads and joysticks to interact with our devices. And while touch has been around for decades, the technology itself just wasn't delivering on the expectations of users,” he says.
“Touch is meant to give users the most natural user interface of all (second only to mind-control). You see something, you want to touch it. You don't want to use a keyboard to touch something 'virtually' and then be told what it feels like,” adds Coetzee.
“Now that touch technology is maturing, manufacturers will start building devices that exclude keyboards and keypads, to create smaller, more compact devices that put games, media and information, literally, at our fingertips.”
Steve Prentice, VP and fellow at Gartner, notes that the whole input format is changing. “For the past 30 years, we've had the concept of a mouse and a keyboard, which is very 2D. As we've moved to notebooks and other mobile computing options, a big trend over the past few years has been touch pads and multi-touch, along with more targeted movements like pinch-and-squeeze.”
“The landscape of mobile devices is going to change drastically in the next five years,” says Coetzee. “Equally, developers who make the applications we use on these devices are going to have a great time innovating around this new way of delivering content and interacting with it.”
Moving experience
In the next few years, predicts Prentice, gestural interfaces for gaming will become hugely popular, and spill over into other uses of the technology.
Adrian Drozd, Frost & Sullivan principal analyst for the telecoms group, Europe, says motion-controlled gaming, which hit the mainstream with the release of the Nintendo Wii, is rapidly gathering pace.
He points out that both Sony and Microsoft are set to introduce motion-control offerings in the next six months: Sony's 'Move' controller debuts in September, while Microsoft's 'Kinect' reaches US customers in November.
“While Sony's controller works along the same lines as the Nintendo Wii's, Kinect tracks users' movements directly. Such a controller-free system opens up new possibilities for compelling game play, although it will also present a new challenge to games developers,” says Drozd.
Prentice notes that gaming systems will start to incorporate front-facing video cameras that detect movement in 3D, allowing the computer to 'understand' motion. He says PC gaming will become a far more immersive experience, with the hand or body acting as the controller. “With no intermediate device between you and the computer, it will bring a whole new sense of play.”
Touching tomorrow
Moving out of the pure gaming space, Prentice predicts everyday devices will also become less discrete. “Instead of carrying around a physical object, people will have devices embedded in a handbag or jacket, activated by putting it on,” he explains. “Controlling a device will become entirely intuitive, so instead of having to switch functions on or off, it will respond to the gestures you make.”
Kari Pulli, research fellow and leader of Nokia's Visual Computing and Ubiquitous Imaging team, sees a long-term scenario where the user can interface with their environment through the seamless merging of physical and digital worlds. “This involves accessing information about the things you see, to allow you to do various tasks,” he says.
“In the home environment, for example, you could control things like the TV or lights simply by pointing at them with your finger. Gestural interfaces will become powerful enough that the need for a keyboard falls away.”
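In rough terms, the 'point at it' interaction Pulli describes could be implemented by comparing the direction a user is pointing against the known positions of registered devices. Below is a minimal, purely illustrative sketch: the device names, positions and angular tolerance are assumptions, not details from any actual Nokia system.

```python
import math

# Hypothetical registry of controllable devices and their 3D positions
# (metres, in some room coordinate frame). Illustrative values only.
DEVICES = {
    "tv": (4.0, 1.0, 0.0),
    "light": (0.0, 2.5, 3.0),
}

def _angle(u, v):
    """Angle in radians between two 3D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def pointed_device(origin, direction, max_angle=0.35):
    """Return the name of the device closest to the pointing ray,
    or None if nothing lies within max_angle radians of it."""
    best, best_angle = None, max_angle
    for name, pos in DEVICES.items():
        to_device = tuple(p - o for p, o in zip(pos, origin))
        a = _angle(direction, to_device)
        if a < best_angle:
            best, best_angle = name, a
    return best
```

The 0.35-radian (roughly 20-degree) tolerance is an arbitrary choice here; a real system would tune it to the accuracy of its hand-tracking.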
He says this future system could take various forms. “One is a handset, so the camera, display, communications and GPS are all on there, and it acts as a magic lens to the environment.” An interface system could also function via goggles or glasses, but Pulli adds that this is still problematic in terms of commercial development.
“People want something that's wearable, and there are still some technology growth pains, such as making it lightweight. Also, if you want the system to be wireless, you're going to need power, which requires a battery that takes up space and weight,” he explains.
“If you settle for using something wired, so there's a device in your pocket with the battery, that's something we can expect to start seeing on the consumer market in the next five years.”
Machine intuition
A good example of this kind of 'interface-in-your-pocket' device is the Massachusetts Institute of Technology's SixthSense prototype. It comprises a camera, a mini-projector and a mobile device, which acts as a computer and connects to information stored on the Web.
According to its creators, SixthSense “is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information”.
Prentice explains that SixthSense mashes up existing technologies to create an intuitive vocabulary of gestures. For example, the user can frame a square with their hands to take a photo, or draw an '@' sign in the air to open e-mail, and the device responds accordingly.
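Once a recogniser has labelled a gesture, the mapping Prentice describes amounts to a simple dispatch table from gesture labels to device actions. The sketch below is purely illustrative and is not SixthSense's actual code or vocabulary; the label and action names are invented for the example.

```python
# Hypothetical gesture-to-action dispatch table. The two mappings
# mirror the examples in the text; labels and actions are invented.
GESTURE_ACTIONS = {
    "frame_square": "take_photo",  # framing a square with both hands
    "draw_at_sign": "open_email",  # drawing an '@' sign in the air
}

def dispatch(gesture_label):
    """Map a recognised gesture label to a device action.
    Unrecognised gestures are ignored rather than raising an error."""
    return GESTURE_ACTIONS.get(gesture_label, "no_op")
```

Ignoring unknown labels, rather than failing, matters for this kind of interface: a camera-based recogniser will inevitably see hand movements that are not commands.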
“It allows any surface to be turned into an interface,” notes Prentice. “In future, you're going to see devices disappear; they'll still exist, but not in the form they are today.”
Pulli points out that ever since computers came into use, the aim has been to make them more user-friendly. “That's the direction we're going: making computers more intuitive to use, so the machine interfaces with people the way people interface with people.”
He says this would enable a computer to understand and respond to things like facial expression, eye contact, gestures and voice. “So you could get a computer to relate to you the same way as a good friend does, who with a single movement knows what you mean. The goal is to make everything more transparent.”
According to Coetzee, the combination of maturing technology, innovative device manufacturers, and talented content developers is driving a demand for more immersive technologies.
“Users are always on the lookout for the 'next big thing' and these technologies have matured to the point that they fulfil that need.”
He stresses that entertainment is only one application where immersive technologies have a potential impact. “We haven't even started to realise where things like 3D and touch will take us, but with a bit of imagination, you can see that there are applications in medicine, manufacturing, telecommunications, law enforcement, sport, childcare, and a whole lot more.
“The truth is that - aside from entertainment - there's going to be a lot of trial-and-error when it comes to adopting these new technologies in other vertical segments, and we may even discover applications that we never thought of.”