Interfaces
Researchers have developed a new app enabling users to operate their smartphone with gestures. This development expands the range of potential interactions with such devices.
It does seem slightly odd at first: you hold the phone in one hand and move the other in the air above its built-in camera, making gestures that resemble sign language.
Sometimes you move your index finger to the left, sometimes to the right. You can spread out your fingers, or imitate a pair of pliers or the firing of a pistol. These gestures are not, however, intended for communicating with deaf people; they are for controlling your smartphone.
All this gesturing wizardry is made possible by a new type of algorithm developed by Jie Song, a Master’s student in the working group headed by Otmar Hilliges, Professor of Computer Science at ETH Zurich. The researchers presented the app to an audience of industry professionals at the UIST Symposium in Honolulu, Hawaii.
The program uses the smartphone’s built-in camera to register its environment. It does not evaluate depth or color. The information it does register – the shape of the gesture, the parts of the hand – is reduced to a simple outline that is classified against stored gestures. The program then executes the command associated with the gesture it observes. The program also recognizes the hand’s distance from the camera and warns the user when the hand is either too close or too far away. In effect, it turns the smartphone into something like a Kinect sensor.
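The pipeline described above – reduce the camera frame to a binary hand silhouette, match it against stored gesture templates, and check the hand’s apparent size to warn about distance – can be sketched as follows. This is a minimal illustration, not the researchers’ implementation; the template grids, gesture names, and distance thresholds are all invented placeholders.

```python
# Illustrative sketch of outline-based gesture classification.
# Each template is a tiny binary grid standing in for a stored hand outline.
TEMPLATES = {
    "swipe_left": [[0, 1, 1], [1, 1, 0], [0, 1, 1]],
    "spread":     [[1, 0, 1], [0, 1, 0], [1, 0, 1]],
    "pistol":     [[1, 1, 1], [1, 0, 0], [1, 0, 0]],
}

def hamming(a, b):
    """Count differing cells between two equal-sized binary grids."""
    return sum(x != y for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def classify(outline):
    """Return the stored gesture whose template best matches the outline."""
    return min(TEMPLATES, key=lambda name: hamming(TEMPLATES[name], outline))

def distance_warning(silhouette_area, near=0.6, far=0.05):
    """Warn when the hand fills too much or too little of the frame.

    silhouette_area is the fraction of frame pixels covered by the hand;
    the near/far thresholds are made-up placeholders, not the app's values.
    """
    if silhouette_area > near:
        return "too close"
    if silhouette_area < far:
        return "too far"
    return "ok"

observed = [[0, 1, 1], [1, 1, 0], [0, 1, 1]]
print(classify(observed))      # matches "swipe_left"
print(distance_warning(0.03))  # "too far"
```

A real system would extract the silhouette from camera frames and use far richer shape descriptors, but the control flow – segment, match, then gate on distance – follows the description in the article.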
"Our goal is not to replace the touchscreen as primary input device, but rather to augment and enrich the existing interaction vocabulary using gestures," claim the researchers. "While touch input works well for many scenarios, we demonstrate numerous interaction tasks such as mode switches, application and task management, menu selection and certain types of navigation, where such input can be either complemented or better served by in-air gestures."
The program currently recognizes six different gestures and executes their corresponding commands. Although the researchers have tested 16 outlines, this is not the app’s theoretical limit. What matters is that gestures generate unambiguous outlines. Gestures that resemble others are not suitable for this application. “To expand its functionality, we’re going to add further classification schemes to the program,” says Song.
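The requirement that gestures generate unambiguous outlines can be made concrete: if the best and second-best template matches score almost the same, the outline resembles more than one stored gesture and no command should fire. The sketch below is a hypothetical illustration of that idea; the margin value and template grids are invented, not taken from the app.

```python
# Hypothetical ambiguity rejection: refuse to classify an outline when
# two stored templates match it nearly equally well.
def classify_with_rejection(outline, templates, margin=2):
    """Return the best-matching gesture, or None when the match is ambiguous."""
    scores = sorted(
        (sum(x != y for ra, rb in zip(t, outline) for x, y in zip(ra, rb)), name)
        for name, t in templates.items()
    )
    best, runner_up = scores[0], scores[1]
    if runner_up[0] - best[0] < margin:
        return None  # outline resembles another stored gesture too closely
    return best[1]

# Two templates that differ in a single cell are easily confused:
close_pair = {"a": [[1, 1], [0, 0]], "b": [[1, 1], [0, 1]]}
print(classify_with_rejection([[1, 1], [0, 0]], close_pair))  # None (ambiguous)

# Clearly distinct templates classify cleanly:
distinct = {"a": [[1, 1], [0, 0]], "c": [[0, 0], [1, 1]]}
print(classify_with_rejection([[1, 1], [0, 0]], distinct))    # "a"
```

This is why similar-looking gestures are excluded from the vocabulary: each added gesture must keep a comfortable margin from every existing one.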
Song is convinced that this new way of operating smartphones greatly increases the range of interactivity. His objective is to keep the gestures as simple as possible, so that users can operate their smartphone effortlessly.
But will smartphone users want to adapt to this new style of interaction? Hilliges is confident they will. Gesture control will not replace touchscreen control, but supplement it. “People got used to operating computer games with their movements.” Touchscreens, Hilliges reminds us, also required a very long adjustment period before making a big impact in consumers’ lives. He is therefore certain that this application – or at least parts of it – will find its way onto the market.
SOURCE ETH Zurich