The writing is on the wall: the traditional car dashboard we know today will not be the dashboard of tomorrow. The next big tech development to invade car cabins, I believe, will be a dashboard controlled with a few simple hand gestures. This is welcome news for drivers who want personalization, safety, and ease of use.

In recent years, carmakers have been rolling out technologies that replace the cabin’s knobs and buttons with hand motions. To turn up the heat or change songs, drivers will soon be pointing and swiping instead of pushing and turning.

And why not? Gesture controls are a logical step for consumers accustomed to tapping and pinching smartphone touchscreens. Beyond that, gesture control in general, and hand tracking in particular, is a technology that opens the door to a richer and safer in-car experience.

As a result, the global automotive gesture recognition system market is set to post a compound annual growth rate (CAGR) of over 30% from 2018 to 2022, according to market research firm Technavio.

Several factors are fueling this growth. For one, automakers are putting larger screens in vehicles, which pushes dashboards toward fewer physical buttons. They are also working hard to simplify the way we interact with our vehicles. And, of course, the technology itself has improved significantly thanks to advances in camera hardware, computer vision, deep learning, and artificial intelligence.


The myriad benefits of “just a swipe”

Many automakers have recognized that gesture control and hand tracking bring benefits in user experience, convenience, and safety.

  • First is safety. Every second we spend searching a touchscreen or button cluster instead of focusing on the road is a second we put ourselves and others at risk. And so, carmakers are striving to find ways for drivers to avoid fiddling with buttons, knobs, switches, or icons on a display screen as the car barrels down the interstate at 70 mph. The ability to wave a hand, rather than glance down at the dashboard, is considered a natural next step in the war against driver distraction.

  • It’s also convenient. Gesturing or pointing in the air demands less precision and attention from the driver than tapping a couple of buttons on a dashboard. When gestures serve as quick shortcuts to frequent actions, the safety and convenience benefits compound.

  • It enhances the driver experience. With a simple swipe of the fingers you’ll be able to operate the display instantly, an effortless interaction. And hand tracking will be complemented by voice: we’ll use voice for single commands and hand motions for continuous actions. For example, it’s easier to adjust the volume with a hand gesture than to keep telling the system to turn the volume up or down again and again (a rough sketch of this idea follows this list).
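To make the continuous-control idea concrete, here is a minimal, purely hypothetical sketch in which the hand’s height above the center console is mapped onto a volume level. None of it comes from a real automotive system; the function names and the 5-35 cm range are invented for illustration.

```python
# Hypothetical sketch: the hand's height above the console is mapped
# straight onto a 0-100 volume scale, so one continuous motion replaces
# a string of repeated voice commands. Nothing here is a real automotive
# API; set_volume() and the 5-35 cm range are invented for illustration.

def volume_from_hand_height(height_cm, min_cm=5.0, max_cm=35.0):
    """Map hand height (cm above the console) linearly onto a 0-100 volume."""
    fraction = (height_cm - min_cm) / (max_cm - min_cm)
    fraction = max(0.0, min(1.0, fraction))   # clamp to the usable range
    return round(fraction * 100)

def set_volume(level):
    """Stand-in for the head unit's volume control."""
    print(f"volume -> {level}")

# Simulated tracker readings as the driver slowly raises a hand.
for height in (8.0, 14.0, 20.0, 26.0, 32.0):
    set_volume(volume_from_hand_height(height))
```

A production system would smooth and rate-limit the tracker readings, but the point stands: one continuous motion replaces a string of repeated commands.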

Automakers including Audi, BMW, Cadillac, and Toyota are all in the process of building some form of gesture technology into their vehicles. Over the next few years, there will be a wave of innovation in this space. One of the most exciting questions is how these high-tech systems, and the automotive cockpits of the future, will be designed.

We’ve been working with major automakers and automotive suppliers worldwide to incorporate hand-tracking capabilities into their designs. One global automaker headquartered in China has been testing uSens 3D skeleton hand tracking and gesture control technology for use in its cockpit to control audio, communications, and more.
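To give a sense of what an application layered on skeleton hand tracking might look like, here is a hypothetical sketch. It does not show the uSens SDK or any automaker’s integration; it only assumes the tracker reports the palm’s horizontal position once per frame and that a quick left-to-right motion should skip to the next track.

```python
# Hypothetical sketch of the application side of skeleton hand tracking.
# The uSens SDK is not shown; assume only that the tracker reports the
# palm's x-coordinate (in meters) once per frame, and that a quick
# left-to-right motion should skip to the next track.

from collections import deque

class SwipeDetector:
    def __init__(self, window=10, min_travel_m=0.20):
        self.positions = deque(maxlen=window)  # recent palm x-coordinates
        self.min_travel_m = min_travel_m       # travel that counts as a swipe

    def update(self, palm_x):
        """Feed one frame's palm x-coordinate; return a gesture name or None."""
        self.positions.append(palm_x)
        if len(self.positions) < self.positions.maxlen:
            return None
        travel = self.positions[-1] - self.positions[0]
        if travel > self.min_travel_m:
            self.positions.clear()
            return "swipe_right"
        if travel < -self.min_travel_m:
            self.positions.clear()
            return "swipe_left"
        return None

# Simulated frames: the palm drifts steadily to the right across ~27 cm.
detector = SwipeDetector()
for x in (0.00, 0.03, 0.06, 0.09, 0.12, 0.15, 0.18, 0.21, 0.24, 0.27):
    if detector.update(x) == "swipe_right":
        print("next track")  # stand-in for the head unit's media control
```

The thresholds, smoothing, and the media call would all come from the production stack; the sketch only shows the shape of the pipeline: frames in, gestures out, actions triggered.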


The road ahead

While the benefits are numerous, bringing hand-tracking and gesture-control products to the mass market requires tackling several issues. This will necessitate a unified effort across the technology supplier ecosystem: sensor and camera manufacturers, processor companies, algorithm providers, and application developers.

  • Standardization: In a market as young as this one, there is still little to no standardization across the ecosystem. Multiple camera technologies are used to generate 3D data, and each technique produces its own characteristic artifacts. Gesture dictionaries are not standardized either: a motion that means one thing on one system may mean something completely different, or nothing at all, on another (a toy illustration follows this list). Standardization is both necessary and inevitable if the industry is to grow and mature.

  • Costs: Hand-tracking and gesture-control solutions are currently expensive because of the depth cameras used today. With the advancement of deep-learning software, however, those sensors are being replaced by infrared camera sensors, a much less expensive alternative. These cameras can be embedded in the dashboard, the ceiling, or behind the steering wheel, and there may be multiple cameras inside the cabin. While camera technologies may change, it is the quality of the hand-tracking and 3D motion recognition software that is key to high performance.
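To illustrate the standardization point above, here are two made-up gesture dictionaries; neither is taken from a shipping vehicle. The same recognized motion maps to different actions, or to nothing at all, depending on the system.

```python
# Hypothetical illustration of the standardization gap: two made-up
# "gesture dictionaries" map the same recognized motions to different
# actions, or to nothing at all. Neither mapping comes from a real vehicle.

SYSTEM_A = {
    "swipe_right": "next_track",
    "rotate_clockwise": "volume_up",
    "two_finger_tap": "answer_call",
}

SYSTEM_B = {
    "swipe_right": "dismiss_notification",  # same motion, different meaning
    "rotate_clockwise": "zoom_map",
    # "two_finger_tap" is not recognized at all on this system
}

for gesture in ("swipe_right", "rotate_clockwise", "two_finger_tap"):
    a = SYSTEM_A.get(gesture, "unrecognized")
    b = SYSTEM_B.get(gesture, "unrecognized")
    print(f"{gesture:<18} system A -> {a:<22} system B -> {b}")
```

This is the kind of fragmentation a shared gesture vocabulary would remove.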

So, lower costs (while still offering high performance) and standardization are what’s needed for mass-market adoption of gesture-control and hand-tracking systems. When this happens, gesture control will expand into other capabilities such as body and facial recognition and tracking, as well as applications such as advanced driver-assistance systems and even anti-theft and identification. Just imagine the car alerting you that you’ve dozed off, that your child has fallen asleep, or that you left your phone behind, all based on computer vision.

The in-car experience is evolving, and the cars of the 2020s will go a long way toward helping drivers keep their eyes on the road. Hope you weren't too in love with those buttons!