Navigating the road to Vision Zero
Descendants of this generation will look back and view traditional driving as archaic. Much as the shift from horse-drawn carriages to cars transformed the early 20th century, the evolution from human-steered vehicles to autonomous cars will be equally disruptive, requiring a holistic ecosystem to drive development and causing a monumental, structural transformation of a large share of the global economy. While the automotive industry would like to make a driverless reality come to fruition sooner rather than later, safety remains a major hurdle the autonomous transportation ecosystem must clear before self-driving cars predominate on our roads.
Globally, nearly 1.25 million people die in road crashes each year, more than 3,000 deaths a day. To make present-day vehicles and the self-driving cars of tomorrow even safer, the autonomous transportation ecosystem, including automotive manufacturers, technology companies, artificial intelligence (AI) providers, policy makers, and academia, must create a solution equal in scale to the challenge. Vision Zero, the goal of eliminating loss of life caused by vehicles, must be the aim if autonomous deployment is to reach its full potential.
The role of ADAS and full autonomy
Next-generation sensors gather information from around the vehicle to create an accurate “picture” of the surrounding environment and inform sensor-fusion algorithms and navigation decision making. AI systems and software that intelligently combine data from several sensors are key to achieving Vision Zero. Yet there are barriers along the way. New high-performance, cost-effective technologies are needed for mass-market automotive adoption, including sensors that meet both performance and practicality requirements. Promising advances in light detection and ranging (LiDAR) and radio detection and ranging (radar) have been made, yet these technologies are not standard fitment in cars today due to cost, reliability, and performance issues. The balance of practicality against performance will define when and how autonomy is adopted in the pursuit of Vision Zero.
Technology providers, Tier 1 suppliers, OEMs, and other leaders in automotive autonomy are embracing new business models and making big bets to accelerate the maturation of key autonomous driving technologies. In pursuit of Vision Zero, the engineering community has been working along two development tracks: advanced driver assistance systems (ADAS), which cover Level 1, Level 2, and Level 3 automated driving, and fully autonomous vehicles, which cover Level 4 and Level 5. Though these are two somewhat parallel paths to Vision Zero, one point of emphasis remains the same: to advance autonomy without compromising safety, the focus must be placed on high-performance, highly reliable sensor technology.
Level 1 and Level 2 systems primarily warn drivers of potential hazards. These systems are inherently error prone and are often switched off because of their high false-alarm rates. Ultrasonic and camera technologies are the primary sensing modalities used in these applications. More recently, ADAS innovation has focused on strengthening overall perception systems, with particular emphasis on radar and LiDAR. These additional sensors are required for the transition to Level 3 automation, in which the vehicle acts in addition to warning the driver. One such application is automatic emergency braking (AEB), which will become a standard option on new vehicles in the early 2020s. Even though ADAS still lacks perfection, these first few generations of systems are whetting the public’s appetite for better-performing driver-assistance technologies.
While focusing on Level 2 and Level 3 autonomy, OEMs view the path to full autonomy as a continuum, with ADAS technology feeding autonomy and vice versa. Some OEMs have created separate companies to develop Level 4 and Level 5 automation, and those with fleets of Level 4 and Level 5 vehicles use these platforms to mature sensor hardware and fusion algorithms, developments that can then be brought back into Level 3 applications as those systems mature.
OEMs that lack the research and development budget to pursue full autonomy themselves are partnering with companies that specialize in autonomous driving technologies. Through this segmentation, the understanding of the vehicular environment and the technologies necessary for full autonomy are continuously improved. Still, much work remains to improve the performance of specific sensing technologies (including radar, LiDAR, and inertial navigation) and the algorithms that actuate vehicles in specific cases.
Sensing technologies are crucial for attaining higher-level vehicle autonomy
Vehicles with more external sensors are more fully aware of their surroundings and are therefore safer. Technologies critical to an AI system capable of navigating an autonomous vehicle include cameras, LiDAR, radar, inertial microelectromechanical systems (MEMS), ultrasound, and GPS. In addition to supporting perception and navigation, these sensors can monitor mechanical conditions such as tire pressure and changes in weight (e.g., one passenger or six, loaded versus unloaded), along with other maintenance factors that affect motor functions such as braking and handling. To support this step-function increase in capability, industry leaders are working to strike an intelligent balance between central and edge processing, both deterministic and AI-based, to minimize latency and reduce the false-alarm rate and overall complexity of the vehicular system.
Radar systems, for example, have traditionally been used for object detection. Next-generation radar systems are becoming capable of the resolution essential for object classification. This dramatic increase in performance is driven by the unique intersection of high-performance, low-power deep submicron IC technology and AI-driven comprehension of complex use cases. These first systems remain a premium feature that will not become widely accessible until the early 2020s, when the AEB mandate becomes a reality. An intelligent balance between central and edge processing significantly reduces overall latency, giving the vehicle more time to take evasive action. Radar will be the fundamental sensing modality driving the next levels of capability in these systems.
LiDAR is not a standard feature in cars today because it is not yet at the right cost or performance point to warrant broader adoption. Nevertheless, LiDAR will provide 10x the image resolution of radar, which is needed to discern even more nuanced scenes (e.g., a child versus an adult).
Camera systems are common in new vehicles today and are a cornerstone of Level 2 autonomy. But these systems do not work well in all conditions (e.g., at night or in inclement weather).
Ultimately, all three of these perception technologies are needed to provide the most comprehensive data set to the systems designed to keep vehicle occupants safe.
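As a toy illustration of why complementary sensors beat any single one, the sketch below fuses range readings from radar, LiDAR, and a camera with inverse-variance weighting. The sensor noise figures here are hypothetical, chosen only to show the principle; production fusion systems use far richer models (e.g., Kalman filters over full object states).

```python
# Illustrative sketch: inverse-variance weighted fusion of range estimates
# from three perception sensors. Noise figures below are assumed values
# for illustration, not real sensor specifications.

def fuse_ranges(measurements):
    """Fuse (range_m, std_dev_m) pairs into a single range estimate.

    Each measurement is weighted by the inverse of its variance, so the
    more precise sensors dominate the fused result.
    """
    weights = [1.0 / (sigma ** 2) for _, sigma in measurements]
    total = sum(weights)
    fused = sum(w * r for w, (r, _) in zip(weights, measurements)) / total
    fused_sigma = (1.0 / total) ** 0.5
    return fused, fused_sigma

# Hypothetical range readings to the same object (metres, 1-sigma noise):
readings = [
    (52.0, 0.5),   # radar: good range accuracy
    (51.6, 0.2),   # LiDAR: best range accuracy
    (53.5, 2.0),   # camera: coarse monocular depth estimate
]
distance, sigma = fuse_ranges(readings)
print(f"fused range: {distance:.2f} m (+/- {sigma:.2f} m)")
```

Note that the fused uncertainty is smaller than that of the best individual sensor, which is the statistical case for carrying multiple modalities even when one of them is nominally superior.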
Inertial navigation technology is often overlooked in current-generation systems but will become a cornerstone in the future. In use cases where GPS is lost and perception sensors become unstable (e.g., when entering a tunnel), the fused inertial and map system must keep the car traveling on its intended trajectory until the perception system stabilizes.
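The idea of bridging a GPS outage can be sketched as simple planar dead reckoning: integrating speed and gyroscope yaw rate from the last good fix. The speed, yaw rate, and sample rate below are assumed values, and a real inertial navigation system fuses accelerometer, gyroscope, and map data with far more sophisticated filtering.

```python
# Illustrative sketch (not a production INS): planar dead reckoning that
# propagates position from the last good GPS fix using hypothetical IMU
# readings (forward speed and yaw rate).
import math

def dead_reckon(x, y, heading, samples, dt):
    """Propagate (x, y, heading) through (speed_mps, yaw_rate_rps) samples."""
    for speed, yaw_rate in samples:
        heading += yaw_rate * dt           # integrate gyro yaw rate
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y, heading

# Hypothetical tunnel traversal: 20 m/s, gentle left curve, 100 Hz IMU.
samples = [(20.0, 0.02)] * 500             # 5 s of readings
x, y, heading = dead_reckon(0.0, 0.0, 0.0, samples, dt=0.01)
print(f"estimated position after GPS loss: ({x:.1f} m, {y:.1f} m)")
```

Because gyro and odometry errors accumulate without an absolute reference, an estimate like this drifts over time, which is why the fused inertial solution only needs to hold the trajectory until GPS and perception recover.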
Higher degrees of automation are being achieved rapidly because of improvements in these fundamental sensors, such as LiDAR, while changes in the automotive landscape, such as the AEB mandate, push the state of the art in ADAS forward.
The industry’s hybrid solution of automotive autonomy
Level 3+ autonomy falls in the middle of the two-track system. Though not fully autonomous, it is more advanced than existing ADAS. Level 3+ can be viewed as the touchpoint between ADAS and fully autonomous vehicles, combining premium performance features with practical functions. Much higher-performance sensors are needed to support Level 3+ applications such as full-speed highway autopilot and AEB+, in which the vehicle not only brakes but also swerves to avoid an accident. Level 3+ features highly autonomous technologies, including a critical sensor framework that lays the foundation for future fully autonomous vehicles.
Although we are not at the point of full autonomy, Level 3+ automation is a crucial step forward towards achieving the goal of Vision Zero. Level 3+ balances practicality and performance, combining developments from the two tracks of ADAS and fully autonomous vehicles to develop a safe transportation ecosystem. This is the inflection point where autonomous technology becomes much more capable and available to the public.
Regardless of industry leaders’ different approaches to Vision Zero, sensors are the tool to get there. A diversity of high-performance perception and navigation sensors, currently being advanced in Level 3+ autonomy, is foundational to the training and operation of AI systems in next-generation vehicles. Additionally, high-quality sensor data is necessary to ensure that decision-making software makes the correct decision, every time.
The developments along the ADAS and fully autonomous vehicle tracks complement each other, making current vehicles safer and the autonomous cars of the future trustworthy for the public at large. The journeys to Vision Zero and to full autonomy follow the same road, and every player in the ecosystem should keep that front of mind in the years to come. The goal of autonomous vehicle development is not only to usher in a new era of technology and lucrative business models but, ultimately, to save lives.
For information on ADI’s Drive360 suite of technologies for highly automated and autonomous driving technology needs, visit https://bit.ly/2Ahi55X.