The vision of change in how we drive
When the first Model T rolled off the Ford assembly lines, the common drivers of those days were not all that concerned about the “driving experience.” They were probably more amazed that machines could replace horses as the primary way for people to get from one point to another. But as cars advanced, so did designers’ attention to the safety and comfort of drivers and passengers. Except for the addition of the radio and the ability to heat and cool the cabin, the way people interact with their vehicles has stayed fairly standard. Sure, new model years bring new gadgets year after year, but the driving experience inside an automobile hasn’t changed all that much in the last several decades.
But that is about to change in a big way. The powerful combination of cameras, sensors, computer vision, and machine learning will soon deliver astonishing differences in how drivers and passengers will experience their vehicles, road conditions, and surrounding environment.
No more bumpy rides
Soon you’ll see a proliferation of cameras and sensors that monitor, detect, and analyze road conditions in front of the car. When the system detects an upcoming bump, it will automatically adjust the car’s suspension, making it softer or harder, lower or higher, depending on road surface quality. Artificial intelligence (AI) will allow road conditions to be processed quickly, letting the system change how the car positions itself on the driving surface. It will work with each wheel independently to mitigate holes or bumps. The result will be the very definition of a smooth ride: road irregularities filtered out, improving comfort, handling, and safety.
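To make that concrete, here is a minimal sketch of what such a per-wheel “preview” controller could look like. Everything in it, from the WheelCommand interface to the specific numbers, is an illustrative assumption rather than any manufacturer’s actual system.

```python
from dataclasses import dataclass

# Hypothetical per-wheel preview controller: given the road height (meters)
# expected under each wheel a moment from now, soften the damper and
# pre-adjust ride height before the disturbance arrives.

@dataclass
class WheelCommand:
    damping: float      # 0.0 = softest, 1.0 = firmest
    ride_height: float  # meters of lift (+) or drop (-) vs. neutral

def preview_command(road_height_m: float, speed_mps: float) -> WheelCommand:
    """Map a predicted road irregularity to a suspension command.

    Assumed heuristic: bigger bumps/holes -> softer damping plus bounded
    ride-height compensation; higher speed -> firmer baseline damping.
    """
    severity = min(abs(road_height_m) / 0.10, 1.0)     # normalize to a 10 cm event
    base = min(0.3 + speed_mps / 100.0, 0.6)           # firmer baseline at speed
    damping = max(0.2, base * (1.0 - severity))        # soften for big events
    lift = max(-0.03, min(0.03, 0.5 * road_height_m))  # bounded compensation
    return WheelCommand(damping=damping, ride_height=lift)

# Each wheel is handled independently, as described above:
# here, a 4 cm bump is coming under the left wheels only.
profile = {"front_left": 0.04, "front_right": 0.0,
           "rear_left": 0.04, "rear_right": 0.0}
for wheel, height in profile.items():
    print(wheel, preview_command(height, speed_mps=20.0))
```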
In addition, cameras will show on a display what’s underneath the car and a few feet ahead. In effect, this creates a “transparent” floor that gives the driver the ability to drive safely around obstacles or over rough terrain. No more guessing about tricky parking spaces, curbs, or rutted tracks.
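One plausible way to build such a transparent floor, and the idea behind some existing ground-view features, is to buffer what the front camera saw a moment ago and replay it once the car has driven over that spot. The sketch below assumes straight-line odometry and a top-down “ground strip” image per frame; both are simplifications for illustration.

```python
from collections import deque

# Sketch: replay previously captured ground imagery once it lies under the car.
# Assumptions: odometer_m tracks the car midpoint's position along the road,
# and each captured strip depicts the ground CAMERA_LOOKAHEAD_M ahead of it.

CAMERA_LOOKAHEAD_M = 2.0

class GroundViewBuffer:
    def __init__(self):
        self.frames = deque()  # (road position the strip depicts, strip image)

    def add(self, odometer_m, strip_image):
        self.frames.append((odometer_m + CAMERA_LOOKAHEAD_M, strip_image))

    def under_car(self, odometer_m):
        """Return the most recent buffered strip the car has now driven onto."""
        latest = None
        while self.frames and self.frames[0][0] <= odometer_m:
            latest = self.frames.popleft()
        return latest[1] if latest else None
```

In a real system the replayed strip would also be warped to account for steering since capture, then rendered beneath a translucent outline of the car on the display.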
Driver monitoring capabilities are already being introduced in early form, detecting a driver’s head pitch and angle, eye closure, cellphone use, and so on. Machine learning uses this information to subtly control various car systems. In the future, as we move closer to Level 4 and 5 autonomy, monitoring sensors will become attuned to the emotional state of the driver, enabling a personalized experience that changes the car’s atmosphere: lighting, music, and dashboard display coloring.
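Eye closure, one of the signals mentioned above, is often estimated from facial landmarks using the eye aspect ratio (EAR) formulated by Soukupová and Čech. The sketch below shows the core calculation; the threshold and frame count are illustrative placeholders, and the landmarks are assumed to come from an upstream face tracker.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """EAR from 6 landmarks ordered around one eye:
    roughly 0.3 when the eye is open, near 0 when closed."""
    v1 = np.linalg.norm(eye[1] - eye[5])  # first vertical distance
    v2 = np.linalg.norm(eye[2] - eye[4])  # second vertical distance
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal distance
    return (v1 + v2) / (2.0 * h)

EAR_THRESHOLD = 0.21      # illustrative; tuned per camera/driver in practice
CLOSED_FRAMES_ALARM = 48  # ~1.6 s of closure at 30 fps

closed_streak = 0
def update(left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
    """Return True when sustained eye closure suggests drowsiness."""
    global closed_streak
    ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2
    closed_streak = closed_streak + 1 if ear < EAR_THRESHOLD else 0
    return closed_streak >= CLOSED_FRAMES_ALARM
```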
A clearer view of everything
In the future, the windshield will become a sophisticated hologram-like display screen incorporating augmented reality (AR) to facilitate some driving maneuvers while also providing real-time information about places where drivers might want to stop.
For example, when a driver needs to select a lane, a virtual arrow will be displayed on the windshield, overlaid on the road itself, indicating where to go. When looking for fuel, besides being navigated to a gas station, the driver might see the station’s logo and current gas prices on the windshield. The same goes for many other destination options, like coffee shops or banks. Everything will be displayed on the windshield and become part of what occupants see from the car.
Real-time information is continuously fused from different data sources, and the windshield view is adjusted based on the driver’s head position and gaze. Based on those, the augmented information will continuously “float” so that it stays viewable without blocking the road view. This information and “view” can be extended to the passenger windows in the same fashion, so other occupants see the same kind of information.
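Keeping a graphic glued to a real-world spot while the driver’s head moves is, at its core, a projection problem: find where the line from the driver’s eye to the target pierces the windshield, and draw the graphic there. A minimal geometric sketch, assuming a flat windshield plane and coordinates invented for illustration:

```python
import numpy as np

def windshield_point(eye: np.ndarray, target: np.ndarray,
                     plane_point: np.ndarray, plane_normal: np.ndarray):
    """Intersect the eye->target ray with the (assumed flat) windshield plane.

    Returns the 3D point on the glass where a graphic must be drawn so
    that, from the driver's eye, it overlays the real-world target.
    """
    direction = target - eye
    denom = plane_normal.dot(direction)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the windshield plane
    t = plane_normal.dot(plane_point - eye) / denom
    if t <= 0:
        return None  # target is behind the driver
    return eye + t * direction

# Illustrative numbers (meters; car coordinates: x forward, y left, z up).
eye = np.array([0.0, 0.4, 1.2])            # driver's eye, from head tracking
target = np.array([30.0, 3.5, 0.0])        # lane-change point on the road
plane_point = np.array([1.0, 0.0, 1.1])    # a point on the windshield
plane_normal = np.array([1.0, 0.0, 0.35])  # raked-back windshield normal

print(windshield_point(eye, target, plane_point, plane_normal))
# Recomputed every frame: as the eye moves, the drawn point "floats"
# across the glass so the arrow stays anchored to the road.
```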
The rear-view mirror will also become a sophisticated display, adapting what it shows to where the car is being driven, the surrounding conditions, and what the driver is attempting to accomplish.
On a busy expressway, the car will understand the context and present nothing beyond what a rear-view mirror generally provides. But when backing up, for example, your rear-view display will become an amalgamation of what the rear-view cameras and sensors are “seeing,” providing a much wider field of view. This will eventually become a 3D display in which a driver can view the scene from a different angle, as if standing outside the car, to ease past tight obstructions or fit into tight parking spaces.
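That wider field of view is, in essence, a panorama stitched from overlapping camera feeds. Here is a minimal sketch using OpenCV’s generic stitcher; a production system would instead use fixed, pre-calibrated mappings between its cameras, which is faster and more stable frame to frame. The file names are placeholders standing in for live, synchronized feeds.

```python
import cv2

# Minimal sketch: fuse overlapping rear camera frames into one wide view.
frames = [cv2.imread("rear_left.jpg"),
          cv2.imread("rear_center.jpg"),
          cv2.imread("rear_right.jpg")]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, wide_view = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("rear_wide_view.jpg", wide_view)  # shown on the mirror display
else:
    print("stitching failed:", status)  # e.g., too little overlap between frames
```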
Dim the lights
Finally, intelligent headlights will eliminate the glare problem that high-beam headlights cause for oncoming cars. A camera-fed computer will identify cars driving in the opposite direction and automatically shut down the specific LED lights in the front matrix aimed at them, removing the section of the beam that would blind the oncoming driver while retaining a safe light level. Everywhere else, the lights will stay fully on, allowing the driver to see the rest of the road clearly. Because the relative positions of oncoming cars are constantly changing, this process will run continuously.
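Conceptually, the controller maps each detected oncoming car to the horizontal angle it occupies and switches off the LED segments covering that angle, frame after frame. A toy sketch, with the 16-segment layout and detection format assumed purely for illustration:

```python
# Toy sketch: decide which LED segments of a matrix high beam to switch off.
NUM_SEGMENTS = 16
BEAM_LEFT_DEG, BEAM_RIGHT_DEG = -20.0, 20.0  # horizontal beam coverage
SEGMENT_WIDTH = (BEAM_RIGHT_DEG - BEAM_LEFT_DEG) / NUM_SEGMENTS
MARGIN_DEG = 1.0  # dim slightly beyond each car to be safe

def segment_mask(detections):
    """detections: list of (left_deg, right_deg) angular extents of oncoming
    cars, from the camera. Returns a list where True = segment stays lit."""
    mask = [True] * NUM_SEGMENTS
    for left, right in detections:
        for i in range(NUM_SEGMENTS):
            seg_l = BEAM_LEFT_DEG + i * SEGMENT_WIDTH
            seg_r = seg_l + SEGMENT_WIDTH
            # switch off any segment overlapping the car (plus margin)
            if seg_r > left - MARGIN_DEG and seg_l < right + MARGIN_DEG:
                mask[i] = False
    return mask

# An oncoming car spanning -6 to -3 degrees: only the segments aimed
# at it go dark. Re-run every frame as its angular position changes.
print(segment_mask([(-6.0, -3.0)]))
```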
Over the next five years, regardless of how quickly autonomous vehicles reach the market, we are going to witness a very different driving experience from what most of us have known since we started driving: a smoother, safer, more personalized, and more entertaining drive. And it’s all powered by the same computer vision and deep learning technologies that are currently changing so many other industries.