New smartphone app understands user commands using hand gestures

A new app for smartphones allows users to control their device with hand gestures. This development extends the range of possible interactions with such devices.

At first it seems slightly strange: we hold the phone in one hand and move the other through the air in front of its built-in camera, making gestures somewhat reminiscent of sign language. Sometimes we move a finger to the left, sometimes to the right. We can spread our fingers, imitate the movement of the “jaws” of a pair of pliers, or mimic the firing of a gun. These gestures are not intended to communicate with deaf people, however; they are used to control your smartphone.

By mimicking the firing of a gun, for example, a user can switch a map view from satellite to normal, or shoot down enemy aircraft in a video game. Spreading the fingers enlarges a section of a map or turns a book to the next page.

This unusual control through gestures, some of them amusingly similar to those magicians perform in supernatural-themed films, is made possible by a new type of computer algorithm developed by Jie Song at ETH Zurich (the Swiss Federal Institute of Technology in Zurich).

The program uses the smartphone’s built-in camera to record the surroundings; it does not assess depth or color. The recorded information (the shape of the gesturing hand and its parts) is reduced to a simple outline, which is classified against previously stored gestures. The program then executes the command associated with the recognized gesture. It also gauges the hand’s distance from the camera and warns the user when the hand is too close or too far away.
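To give a sense of how such a pipeline might be structured, here is a minimal sketch in Python using the OpenCV library. It is not the ETH Zurich algorithm described above: the gesture templates, the area thresholds, and the simple silhouette step are invented for illustration, and the real program processes the camera image in a more sophisticated way.

```python
# Hypothetical sketch of a gesture pipeline: frame -> outline -> nearest stored gesture.
import cv2

# Previously stored gesture outlines (contours), keyed by the command they trigger.
# In practice these would be loaded from recorded reference gestures; here the
# names "toggle_map_view" and "zoom_in" are placeholders.
GESTURE_TEMPLATES = {}  # e.g. {"toggle_map_view": contour, "zoom_in": contour}

MIN_AREA = 5_000    # below this silhouette area, assume the hand is too far away
MAX_AREA = 200_000  # above this, assume the hand is too close (illustrative values)

def classify_frame(frame):
    """Reduce a camera frame to a hand outline and match it to a stored gesture."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Depth and colour are ignored: a simple threshold yields a binary silhouette.
    _, silhouette = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(silhouette, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)  # assume the largest blob is the hand

    # Warn the user when the hand is too close to or too far from the camera.
    area = cv2.contourArea(hand)
    if area < MIN_AREA:
        return "warn: hand too far from camera"
    if area > MAX_AREA:
        return "warn: hand too close to camera"

    # Compare the outline against each stored gesture and pick the closest match.
    best_command, best_score = None, float("inf")
    for command, template in GESTURE_TEMPLATES.items():
        score = cv2.matchShapes(hand, template, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best_command, best_score = command, score
    return best_command
```

In this sketch, the command returned by classify_frame would then be dispatched to the relevant app function, which corresponds to the article’s description of the program executing the command associated with the recognized gesture.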

Many motion-recognition programs consume a great deal of processor power and memory, and this is perhaps where the application created by Song and colleagues stands out most. Their new algorithm uses a much smaller share of a device’s memory and is therefore ideal for smartphones. The application’s modest use of computational resources also means that, beyond consumer smartphones, it could be implemented in smartwatches or augmented-reality glasses.

Gesture control is not meant to replace the smartphone’s touchscreen controls; rather, its creators intend it to complement them, extending the range of user control and making some activities easier and more convenient.

This archive content was originally published November 4, 2014 (www.betawired.com)