Advanced driver-assistance systems (ADAS) are getting a lot of attention lately, and for good reason. The automation of tasks within a vehicle has advanced notably. Thanks to artificial intelligence (AI), especially image-processing applications, ADAS is being touted as a major step forward for both semi- and fully automated vehicles. The benefits could go well beyond simply increasing drivers' overall awareness.

Government mandates have certainly helped. Both the United States and the European Union have codified the use of the original ADAS applications: automatic emergency braking and forward-collision warning. All new cars must be equipped with these features by the year 2020. However, new ADAS features, influenced by AI, could very well become the key differentiator in the AV market and an important revenue source.

“With the rapid development and deployment of various advanced driver-assistance systems packages by OEMs, higher level automation represents the next suitable step,” said Shiv Patel, Research Analyst at ABI Research. “The primary functional sensor gap between today’s ADAS and higher level autonomous vehicles will be filled with the addition of LiDAR, which will help to provide reliable obstacle detection and simultaneous localization and mapping (SLAM).”

It is entirely possible that future ADAS offerings could influence the ultimate creation of a fully autonomous vehicle. It is no wonder that ADAS is currently the focus of research and development projects run by OEMs and the major auto tech players. The possibilities for ADAS are endless, thanks to the integration of several technologies: processors, sensors, mapping, and software algorithms.

For autonomous vehicle systems, image processing is where software algorithms can still move the needle. Algorithms are developed to collect and synthesize inputs from various sensors, monitoring the surrounding road environment in real time. Algorithms are also developed to predict human behavior as accurately as possible. This is no small feat because, as we all know, humans are sometimes irrational and generally unpredictable. This is why fully automated autonomous vehicles could require some of the most complex in-car software integration ever created.
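To make "collect and synthesize" concrete, here is a minimal sketch of one common approach to combining multi-sensor input: a confidence-weighted average of per-sensor estimates, where a sensor's weight is lowered when conditions (fog, in this toy case) degrade it. All sensor names, readings, and penalty factors are illustrative assumptions, not any vendor's implementation.

```python
# Hypothetical sketch: each sensor reports a distance estimate plus a
# confidence that depends on conditions. Values below are made up.

def fuse_estimates(readings):
    """Combine (value, confidence) pairs into a confidence-weighted average."""
    total_weight = sum(conf for _, conf in readings)
    if total_weight == 0:
        raise ValueError("no usable sensor input")
    return sum(value * conf for value, conf in readings) / total_weight

def adjust_for_conditions(readings, fog=False):
    """Down-weight the camera in fog; lidar and radar are less affected."""
    adjusted = {}
    for sensor, (value, conf) in readings.items():
        if fog and sensor == "camera":
            conf *= 0.2  # illustrative penalty, not a calibrated figure
        adjusted[sensor] = (value, conf)
    return adjusted

# Example: distance to an obstacle, in meters, from three sensors.
raw = {"lidar": (25.1, 0.9), "radar": (24.6, 0.7), "camera": (26.0, 0.8)}
clear = fuse_estimates(adjust_for_conditions(raw).values())
foggy = fuse_estimates(adjust_for_conditions(raw, fog=True).values())
print(round(clear, 2), round(foggy, 2))
```

In fog, the fused estimate shifts toward the lidar and radar readings because the camera's vote counts for less; production systems use far richer models (Kalman filters, learned fusion networks), but the weighting idea is the same.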

ADAS algorithms currently serve two main purposes. The first is to provide output to the driver, primarily in the form of alerts. The second is to react to road conditions by manipulating the vehicle (braking, steering, or other safety-related commands). In the future, however, algorithms bolstered by machine learning will be able to facilitate many other tasks associated with autonomous vehicles, such as:

  • Detection and classification of road events with high accuracy and minimal latency. AI can help the driver avoid obstacles and pedestrians and respond to traffic signs. The operation can be either passive (dedicated alarms) or active (taking appropriate action); examples include lane-departure warning (passive) and lane centering using automatic steering (active).
  • AI can also be used to monitor driver attention and awareness by detecting early signs of fatigue, dizziness, and distraction, again resulting in passive or active responses. The car's driving profile could even adapt to the driver's momentary state, with responses ranging from a driver alert to speed limitation up to a full stop, when driving conditions require it.
  • Different environmental conditions make some sensors more effective at particular moments. Trained neural networks can automatically weight the most useful sensor subset at any given moment. This sensor “fusion” then supports the best driving decisions possible, given the information acquired from all of the sensors mounted on the vehicle.
  • Convolutional neural networks (CNNs), one of the key techniques in “deep learning,” can classify conditions that were too complex to solve with classical computer-vision methods. In the past, a road event like approaching a crossing junction involved too many rules for classical computing techniques; such problems are now commonly solved by training neural networks.
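The CNNs mentioned above are built from stacked convolution layers. As a minimal, self-contained sketch of the core operation (pure Python, purely illustrative numbers, not any production vision stack), a small vertical-edge kernel slides over a tiny grayscale patch and produces a response map whose large values mark edges:

```python
# Sketch of a single 2D convolution pass, the building block of a CNN.
# "Convolution" here is cross-correlation, as in most ML libraries.

def convolve2d(image, kernel):
    """Valid-mode 2D convolution over a list-of-lists grayscale image."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A 4x5 patch with a sharp vertical boundary between dark (0) and bright (9).
image = [
    [0, 0, 9, 9, 9],
    [0, 0, 9, 9, 9],
    [0, 0, 9, 9, 9],
    [0, 0, 9, 9, 9],
]
# A 3x3 Sobel-style vertical-edge kernel.
kernel = [
    [-1, 0, 1],
    [-2, 0, 2],
    [-1, 0, 1],
]
response = convolve2d(image, kernel)
print(response)
```

Columns of the response map near the dark-to-bright boundary light up, while the uniform bright region produces zeros. A real CNN learns many such kernels from data and stacks them with nonlinearities and pooling, which is what lets it recognize events like an approaching junction without hand-written rules.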

In the meantime, ADAS will continue to be a bridge to the reality of autonomous vehicles. Artificial intelligence will continue to facilitate AVs taking on more tasks. The end result should be new levels of safety for drivers, passengers, and all others who share the road. The National Highway Traffic Safety Administration (NHTSA) says that 94% of all U.S. traffic accidents are caused by human error—which includes errors of recognition (the largest percentage), decision, performance, and non-performance.

AI and machine learning will play a monumental role in eliminating many of these human mistakes. Many people already believe ADAS has had a positive effect in reducing highway fatalities.

“Advanced driver assistance systems increase safety and have already prevented many accidents,” said Frank Jourdan, Board Member for Continental. “Emergency brake assistance systems are invaluable when it comes to avoiding rear-end collisions.”

The NHTSA reported that between 2017 and 2018, the highway fatality rate dropped by 1.8%. However, because humans are humans, several issues remain: distracted, reckless, and impaired driving. In addition, humans will still need to come to grips with the idea that machines, eventually, will do the driving for them. What ADAS accomplishes right now tells us that technology is up to the task of helping us make good driving decisions.