Ultrasound chip offers gesture control for mobiles

Ultrasound technology that enables mobiles and tablets to be controlled by gesture could go into production as early as next year.
Norwegian start-up Elliptic Labs is in talks with Asian handset manufacturers to get the chip embedded in devices.
The technology works via an ultrasound chip that uses sound waves to interpret hand movements.
The move towards gesture control has gathered pace and there are now many such products on the market.
Big gestures
What sets Elliptic's gesture-control system apart from others is its wide field of use: it works up to a metre away from the phone, and can identify mid-air gestures accurately.
Because it uses sound rather than sight, the sensor can recognise gestures across a 180-degree field. It also consumes less power and works in the dark.
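The article does not detail Elliptic's signal processing, but the basic idea behind ultrasonic sensing, timing how long an emitted pulse takes to echo back from the hand, can be sketched roughly as follows. This is a minimal illustration, not Elliptic Labs' actual algorithm; the function names, sample delays and threshold are invented for the example.

```python
# Rough sketch of ultrasonic time-of-flight gesture sensing.
# Not Elliptic Labs' real algorithm; values are illustrative.
SPEED_OF_SOUND = 343.0  # metres per second, in air at ~20 C

def echo_distance(echo_delay_s: float) -> float:
    """Distance to the hand, from a round-trip echo delay in seconds."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def classify(distances: list[float], threshold: float = 0.10) -> str:
    """Label a sequence of hand distances as a push, pull or hold."""
    change = distances[-1] - distances[0]
    if change < -threshold:
        return "push"   # hand moved towards the device
    if change > threshold:
        return "pull"   # hand moved away from the device
    return "hold"

# Example: echoes arriving sooner and sooner means the hand is approaching.
delays = [0.0040, 0.0030, 0.0020, 0.0010]   # seconds, hypothetical samples
track = [echo_distance(d) for d in delays]  # ~0.69 m down to ~0.17 m
print(classify(track))                      # push
```

In a real device the chip would work from raw microphone signals rather than clean delay values, but the principle is the same: sound travels out and back, so halving the round-trip time gives the hand's range, and tracking that range over time yields a gesture.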
By contrast, Samsung's Galaxy S4 uses an infrared sensor that can interpret hand movements only within a very small zone.
"The user needs to learn the exact spot to gesture to instead of having a large interactive space around the device," said Erik Forsstrom, the user interface designer for Elliptic Labs.
Allowing users more freedom in how they gesture is vital if such products are to become mainstream, he thinks.
"With a small screen such as a phone or a tablet, the normal body language is not that precise. You need a large zone in which to gesture."
If consumers can quickly see the effects their gestures have on screen, he thinks, "it is quite likely that this is the next step within mobile".
The technology was recently shown off at the Japanese tech show Ceatec.
In the demonstration, an Android smartphone was housed in a case containing the ultrasound transmitters.
But Elliptic Labs said it had formed partnerships with a number of Asian handset manufacturers who are looking at building the ultrasound chip into devices as early as next year.
Mass market
Increasingly, firms are experimenting with gesture control.
PrimeSense, the company that developed the gesture-sensing technology behind Microsoft's Kinect, has also made strides towards bringing it to mobile.
By shrinking down the sensor used in the Kinect, the firm showed it working with a Nexus 10 at a Google developers' conference in May.
Meanwhile Disney is testing technology that allows users to "feel" the texture of objects on a flat touchscreen.
The technique involves sending tiny vibrations through the display that let people "feel" the shallow bumps, ridges and edges of an object.
Ben Wood, an analyst with research firm CCS Insight, thinks such devices could be ready for the mass market.
"Apple's success has made gestures a part of everyday life. Now consumers understand they can manipulate a screen with a gesture or a swipe, everyone is racing to find innovative ways to exploit this behaviour.
"Ultrasonic is particularly interesting as you don't need to touch the screen, which can be an almost magical experience.
"It is ideal if you have dirty or sweaty hands. A common example people use is flicking through a recipe when cooking. Other examples include transitioning through a slideshow of photos or flicking through music tracks or turning the page on an ebook," he said.
"The big challenge that remains is how you make users aware of the capability."