BC Explains: Gesture control
03 July 2018
Author: Rachel Boagey
In our third explainer feature, Jack Carfrae gives a nod to gesture control.
Ever had a go with a Nintendo Wii? If not, it's the immensely popular games console that has a stick with motion sensors instead of a control pad, and encourages players to flail their limbs wildly in front of the TV screen.
That hasn't got much to do with cars, but it's arguably the most famous example of gesture control in action. At their core, gesture control or recognition systems are designed to identify certain human actions or body language - a wave of a hand, a nod of the head, a twirl of a finger - and interpret their meaning to perform a specific function. Cameras or motion sensors are typically employed to capture the initial human movements, while carefully calibrated algorithms are responsible for telling the system what the person wants it to do.
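As a toy illustration of that sensor-plus-algorithm pipeline, the sketch below classifies a simulated hand trajectory as a left or right swipe. It assumes an upstream sensor that reports the hand's position each frame; the function name and thresholds are invented for illustration, not any vendor's API.

```python
# Minimal sketch of a gesture-recognition step, assuming a sensor that
# yields the (x, y) centroid of a detected hand in each frame.

def classify_swipe(trajectory, min_travel=0.3):
    """Label a hand trajectory as a left or right swipe, or no gesture.

    trajectory -- list of (x, y) hand positions, x normalised to 0..1.
    min_travel -- minimum horizontal travel to count as a swipe.
    """
    if len(trajectory) < 2:
        return None
    dx = trajectory[-1][0] - trajectory[0][0]
    if dx >= min_travel:
        return "swipe_right"
    if dx <= -min_travel:
        return "swipe_left"
    return None

# A hand moving steadily left to right across the sensor's view:
frames = [(0.1, 0.5), (0.3, 0.5), (0.6, 0.5), (0.9, 0.5)]
print(classify_swipe(frames))  # -> swipe_right
```

A production system works on far richer 3D data, but the shape is the same: capture positions over time, then let a calibrated rule (or trained model) decide what the movement meant.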
The technology began life in the 1980s when, in its primitive form, it was paired with voice recognition and (then) high-tech gloves to interact with projections on a screen. More developed versions have subsequently been applied to understand sign language, to move cursors on computers and to unlock smartphone screens, among plenty of other applications.
Gesture control can be found in the automotive industry today, more often than not within infotainment systems. One of the most common examples is Volkswagen's Discover Pro system, which is available with the current Golf. It allows the operator to control certain functions by waving their hand left or right in front of the screen, which means you can, for example, change the radio station or flick through menus without physically touching anything.
Other than being rather smart, the selling point for drivers and fleet operators is the lack of driver distraction. Put simply, you don't have to take your eyes off the road to wave or wiggle your fingers, but you do if you're prodding around for a specific switch.
Though the technology is capable of recognising movements from numerous parts of the human body, at present, vehicular gesture control systems are typically limited to hand signals. The Depthsense Carlib system, developed by Sony Depthsensing Solutions and found in the current BMW 5 and 7 Series, does exactly that. It can control elements such as the audio volume, the navigation system and incoming phone calls (you can answer or reject them) when the driver points, swipes or makes circular motions with their hands within the system's field of vision. Robert Vermeer, the company's senior product manager for automotive, explains how it works: "We're using an optical image sensor - actually a 3D sensor - to locate the position of a hand and a finger, and then we use computer vision processing to analyse the image and to classify certain gestures of the hand after the image has been accepted. In a nutshell, the user is able to get control of certain parts of the vehicle by using hand gestures. In our case, it's the infotainment control."
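The kind of classification Vermeer describes can be hinted at with a toy example. The sketch below labels a circular fingertip motion as a clockwise or anticlockwise twirl by accumulating the signed angle swept around the path's centroid. It assumes a tracker has already located the fingertip per frame, and every name and threshold is illustrative rather than part of the Carlib system.

```python
import math

def swept_angle(points):
    """Total signed angle (radians) the fingertip sweeps about the centroid."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    total = 0.0
    for a, b in zip(angles, angles[1:]):
        d = b - a
        # unwrap jumps across the +/-pi boundary
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return total

def classify_twirl(points, min_turn=1.5 * math.pi):
    """Map a roughly circular motion to a volume command (illustrative only)."""
    turn = swept_angle(points)
    if turn >= min_turn:
        return "volume_up"    # which direction means "up" is an OEM choice
    if turn <= -min_turn:
        return "volume_down"
    return None

# One full anticlockwise circle of fingertip samples:
circle = [(math.cos(t), math.sin(t))
          for t in [i * 2 * math.pi / 12 for i in range(13)]]
print(classify_twirl(circle))  # -> volume_up
```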
In order to successfully replace conventional buttons, the systems need to execute the desired task and also let the driver know they've done it. It's obvious enough in some cases, such as when you twirl your index finger and the stereo cranks up, but certain systems emit an audible acknowledgement for particular functions, to make up for the lack of physical sensation and feedback of pushing a switch.
"Our system does have audible feedback; we're taking the feedback system outside the loop of the actual gesture system," says Vermeer. "However, if you're moving your hand in a certain position to control the volume, you are getting feedback from the volume system of the car. In the case where you're accepting or rejecting calls using the hand gesture, then of course, you have audible feedback from the vehicle anyway."
As far as current production cars are concerned, cockpit-based functions linked to the infotainment system are where you're most likely to find gesture control, but the technology has already been applied to other areas and is due to expand.
BMW, again, has put it to work in an exterior application. The i3 plug-in city car features an automatic parking function, known as Parking Assistant, which independently tucks the car into an adjacent space with little more than a prod of a button. That is neither an example of gesture control nor a breakthrough, as plenty of modern cars have similar abilities, but the manufacturer debuted a particularly tricky feature at the 2016 Consumer Electronics Show, which combined the two in the form of a smartwatch. When it's strapped to the driver's wrist, it's possible to park the car by waving your hand in a certain direction while you're standing outside it. The idea is that those dealing with a snug garage or tight space can exit the vehicle, wave at it, and leave it to tuck itself into the desired spot.
Back in the interior, work is under way to advance the relevance of gesture control and increase the number and types of hand signals it can recognise. In January 2018, electronics giant LG announced a partnership with Israeli gesture recognition specialist Eyesight to develop motion identification. In addition to relatively simple hand movements, such as waving or wagging a finger, the companies claim their efforts will allow drivers to make an 'OK' sign (index finger and thumb pressed together) to answer an incoming phone call, and raise two fingers to toggle between the navigation screen and the main menu. The theory is that, by introducing everyday gestures, drivers will find the technology less baffling as it's rolled out across a greater number of vehicles.
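The gesture-to-function bindings described above amount to a lookup table, with room for per-manufacturer customisation. A minimal sketch, with invented gesture labels and actions that stand in for whatever an OEM would actually specify:

```python
# Standard bindings a supplier might ship by default (names invented).
DEFAULT_GESTURES = {
    "ok_sign": "answer_call",
    "two_fingers": "toggle_nav_menu",
    "swipe_left": "previous_station",
    "swipe_right": "next_station",
}

def build_gesture_map(oem_overrides=None):
    """Merge an OEM's custom bindings over the standard gesture set."""
    table = dict(DEFAULT_GESTURES)
    if oem_overrides:
        table.update(oem_overrides)
    return table

def dispatch(gesture, table):
    """Look up the action for a recognised gesture; unknown gestures are ignored."""
    return table.get(gesture, "ignored")

# An OEM that prefers the two-finger sign to open the glove box:
custom = build_gesture_map({"two_fingers": "open_glove_box"})
print(dispatch("ok_sign", custom))      # -> answer_call
print(dispatch("two_fingers", custom))  # -> open_glove_box
```

Keeping the mapping in data rather than code is what makes the "customisable per OEM" idea cheap: the recognition front end stays the same while each manufacturer supplies its own table.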
ZF is even further down the futuristic rabbit hole. In December 2017, the automotive technology behemoth revealed a concept steering wheel, complete with gesture control. Designed for Level 3 autonomous vehicles (those that will eventually allow drivers to take their eyes off the road and perform other tasks, while still having to remain at the helm), the wheel is octagonal and has an iPad-style touch screen in its centre. It's a bit pie in the sky, but the company says functions such as the indicators, infotainment system, climate control and the horn could be controlled by a combination of tapping the various edges of the wheel and performing swiping and waving motions, depending on how the manufacturer wanted to set it up.
Manufacturers' desires also play a part in the adoption of gesture control and other technologies that might replace conventional switches. As well as removing driver distraction and appearing cutting edge, there is a long-term economic argument for doing away with old-fashioned buttons.
"Auto designers want to get rid of this whole switchgear thing, because it means they can have much more freedom over the shape and style of the panels that people interact with," says Thomas Arundel, technical marketing director at electric motor specialist Precision Microdrives.
"Some cars have stalks that you can use to control the audio system and a lot of buttons now appear on steering wheels. If you imagine a switch might cost a dollar, and you've got six switches on a panel, that's $6. If you could replace that panel with a more modern equivalent that costs $4, you save yourself $2 on that panel.
"It only really works for panels that have more than a handful of switches, but a window opening panel, for example, or an air conditioning panel - there might be a cluster of eight switches on there - or maybe the lights above your head. So the motivation is cost, but also there's a lot of flexibility on design."
"We see the next-generation cockpit as fairly clean with very few physical buttons," adds Sony's Vermeer. "The next-generation gesture control could replace the buttons for anything. It could be for opening the glove box."
As the technology progresses, it is expected to capture movements from different areas of the body and not just from the driver. "We are working on wider field of view cameras, which can look at the driver's head position. Depending on where the camera is looking and how much view we have, we can look at a particular region of interest, like the head, and we can extract a feature such as your pitch and yaw angle, so it's not limited to the hand.
"Those gesture sets will be expanded and, let's say, experimented with by the OEMs, and they'll be customisable. So you may have a standard set of gestures or you may even have some specific gestures, which an OEM likes to differentiate its vehicle, and we believe that technology will evolve so that every occupant of the cabin will be able to control certain functions by gesture."
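The head-pose extraction Vermeer mentions, turning a tracked "forward" direction of the head into pitch and yaw angles, reduces to simple trigonometry once a face tracker has done the hard work. A minimal sketch under an assumed coordinate convention (x right, y up, z from the driver's face toward the camera); the function is illustrative, not any supplier's API:

```python
import math

def head_pitch_yaw(direction):
    """Convert a head 'forward' unit vector into (pitch, yaw) in degrees."""
    x, y, z = direction
    yaw = math.degrees(math.atan2(x, z))                    # left/right turn
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down nod
    return pitch, yaw

# Driver looking straight at the camera:
print(head_pitch_yaw((0.0, 0.0, 1.0)))  # -> (0.0, 0.0)
```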