Wider application possibilities for gesture control

Rob Lane looks at the possible applications for gesture control and whether it is scalable to larger devices.

The news that Elliptic Labs has developed bat-like ultrasonic technology providing touchless gesture control and recognition for smartphones and tablets has given gesture control tech watchers a boost – and would get Batman's approval! Exhibited at CEATEC Japan in October, it is intended to help mobile device manufacturers create more intuitive navigation experiences and faster interactions with their devices.

Ultrasound is beamed from the device's transmitter speakers onto the user's hand and reflected back to integrated microphones, allowing the technology to recognise any movement within a 180º field of view. Perhaps the ultimate in lazy smartphone/tablet usage, it offers an exciting alternative to existing gesture touch technologies, especially if it's scalable to larger devices.

The 180º field of view is certainly an advantage, as is 'distributed sensing', which captures the motion of a user's hand from multiple angles and so avoids occlusion. In addition, 'range gating' separates the 'first returning' ultrasonic echoes from those arriving later, so Elliptic's tech can distinguish foreground from background and stop accidental gestures from being recognised.
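The principle is easier to picture with a little arithmetic. Below is a minimal sketch in Python – purely illustrative, not Elliptic's code, and the 0.5m gate distance and echo timings are my own assumptions – showing how a round-trip echo time converts to a distance and how a range gate discards the later, background echoes.

```python
# Minimal sketch of ultrasonic echo ranging and range gating (illustrative only).

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C


def echo_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting hand: the pulse travels out and back."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0


def range_gate(arrival_times, max_distance_m=0.5):
    """Keep only echoes reflected from inside the gate (the foreground),
    discarding later arrivals from the background."""
    return [t for t in arrival_times if echo_distance(t) <= max_distance_m]


# Example: echoes arrive after 1ms, 2ms and 6ms. Only the first two
# (roughly 17cm and 34cm away) fall inside a 0.5m gate.
arrivals = [0.001, 0.002, 0.006]
print([round(echo_distance(t), 2) for t in range_gate(arrivals)])  # [0.17, 0.34]
```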

I often mention movies in this column. With gesture interaction, Minority Report is always the movie cited by the wider press, and it's interesting that just over a year ago Bristol University's 'ultrahaptics' was reported as being reminiscent of the gloves worn by Cruise – particularly as ultrahaptics uses tiny speakers called 'ultrasonic transducers' to sense movement, à la Elliptic.

In this case, waves of ultrasound are projected through the display, displacing the air to create acoustic radiation pressure that the user's skin reacts to. The big difference is that these ultrasound waves are said to create vibrations on the skin, meaning that in theory users could sense different levels of vibration-pressure 'textures', making tactile mid-air touch distinctions possible – so users could 'feel' each navigation control on the device, like Spider-Man when his Spidey senses are tingling.

However, additional technology – something like a Leap Motion sensor or a Microsoft Kinect – is needed to control what's on the screen, as the ultrasound does not communicate back to the source display.

Another new gesture control tech, also initially aimed at smaller computing devices but with scope to interact with pretty much anything, is the Thalmic Labs Myo armband. Designed to fit around the user's forearm, the start-up's technology detects small muscle movements, rotations of the arm and electrical muscle impulses, using this information to control a device's functionality.

Unlike Kinect, this is a mobile solution and it has potential medical applications, perhaps allowing surgeons to interact with devices during operations without touching them.

Thalmic has integrated Myo with wearable display tech including Google Glass, Epson Moverio and Oculus Rift. Video demos show users flicking their wrists to move things across the screen and rotating their arms to adjust volume – exactly as you'd imagine it working. The pre-order cost is $150 and it comes with the 10-foot Experience app, allowing users to control presentations from between 10ft and 30ft away. This is mainly an opportunity to jump on board with the evolution of a product that's still very much in its infancy.
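It's easy to imagine the mapping layer such an armband needs: gesture events in, device commands out. The Python sketch below is a hypothetical illustration of that idea, not Thalmic's actual API – the event names, rotation-to-volume ratio and handlers are all my own assumptions, loosely based on the demos described above.

```python
# Hypothetical gesture-to-command dispatch for an armband controller
# (illustrative only; not Thalmic Labs' API).

def next_slide():
    print("Advancing to the next slide")


def previous_slide():
    print("Returning to the previous slide")


def adjust_volume(degrees: float):
    # Assumed mapping: two degrees of arm rotation per volume step.
    print(f"Volume changed by {int(degrees / 2)} steps")


FLICK_HANDLERS = {
    "wrist_flick_right": next_slide,
    "wrist_flick_left": previous_slide,
}


def handle_event(event: dict):
    """Dispatch one gesture event reported by the armband's sensors."""
    if event["type"] == "arm_rotation":
        adjust_volume(event["degrees"])
    elif event["type"] in FLICK_HANDLERS:
        FLICK_HANDLERS[event["type"]]()


# Example events, as they might arrive during a presentation.
for event in [{"type": "wrist_flick_right"},
              {"type": "arm_rotation", "degrees": 20}]:
    handle_event(event)
```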

Back to Minority Report's mittens: Fujitsu's recently developed 'glove' device uses a near field communication (NFC) tag reader for 'on-site workplace use'. The device has a contact sensor on the finger, and the NFC reader allows users to touch the tags on objects to digitally identify them. The device can also recognise gestures thanks to a gyro sensor and accelerometer in the wrist unit, allowing for basic up, down, left, right, rotate-left and rotate-right movements.

Fujitsu is aiming the device at industries where the working environment requires gloves or gets workers' hands dirty, making it tricky to use mobile devices. The gesture control cleverly kick-starts when the user's wrist is bent back, in a movement reminiscent of Spider-Man (again!) squirting his webbing.
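Putting those two details together – wrist-bend activation plus a handful of directional gestures – the glove's behaviour could look something like the Python sketch below. This is an illustration only, not Fujitsu's implementation: the 45º activation threshold, axis conventions and sensor values are assumptions.

```python
# Illustrative sketch of wrist-bend activation and simple directional gesture
# classification from accelerometer/gyro readings (not Fujitsu's implementation).

WRIST_BEND_THRESHOLD_DEG = 45.0  # assumed angle at which gesture control kicks in


def gesture_active(wrist_bend_deg: float) -> bool:
    """Gestures are ignored until the wrist is bent back far enough."""
    return wrist_bend_deg >= WRIST_BEND_THRESHOLD_DEG


def classify(accel_x: float, accel_y: float, gyro_z: float) -> str:
    """Pick the dominant motion: rotation, lateral swipe or vertical swipe."""
    if abs(gyro_z) > max(abs(accel_x), abs(accel_y)):
        return "rotate-right" if gyro_z > 0 else "rotate-left"
    if abs(accel_x) > abs(accel_y):
        return "right" if accel_x > 0 else "left"
    return "up" if accel_y > 0 else "down"


# Example: wrist bent back 60º, then a strong leftward acceleration.
if gesture_active(60.0):
    print(classify(accel_x=-1.2, accel_y=0.3, gyro_z=0.1))  # -> "left"
```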

Reading this back, it’s clear that gesture control start-ups need to keep watching the movies for inspiration – particularly Minority Report and the comic book adaptations!