User experience, interaction, and engagement: a complex trifecta
To keep up with the rapid pace of innovation outside the industry and foster consumer trust, automakers must ensure they provide user interfaces that create enhanced in-vehicle experiences.
User interfaces (UIs) can help drivers and riders better understand the handover of control in a Level 3 or 4 autonomous car, allowing them to feel safer and more comfortable—and fully trust the car they’re in. The main challenges for future UX (user experience) concepts are the acceptance of automated driving technology by the driver, coupled with the complex handover process from human to automated driver.
Stefan Ilijevic, Manager of Pre-development Innovation and Patents at Spanish automaker Seat, told AVT that one of the main challenges relates to the growing number of ADAS (advanced driver-assistance system) features onboard.
“Some of them interact with the others, and the related haptics, acoustics, and visual warnings have to be shown at the same time without disturbing the driver experience,” he said.
Ilijevic explained that the Seat technical center develops UX/UI designs based on Volkswagen Group concepts and then adds an emotional layer of its own, which will appear in the company’s next Leon vehicle and in future developments.
“We already develop complex software to define the priorities when an event occurs in the car, and we test it in the real world, but we also run various tests over a long period before the car reaches the market,” he said.
For example, Ilijevic noted that safety-relevant messages have to be shown at the right moment, and non-critical operations must not disturb this process—something that haptic feedback could help with in the future.
“The industry knows that by adding more tech to the vehicle, they’re able to be helpful but may also become more of a nuisance; you don’t want to have an adverse effect,” explained Matt Arcaro, Research Manager for Next Generation Automotive at IDC.
He said the challenge is to combine a multitude of complex systems and present them to a layman driver who just wants to get from Point A to Point B.
“How do you get them to understand what each system does and make sure their cognitive focus is on driving?” he asked. “From the UI/UX perspective, there are a lot of challenges—do you go heavy-handed in terms of the feedback mechanisms, or take a more subtle approach so users can tailor alerts to their preferences?”
Arcaro said that voice recognition systems could play a vital role in the future because drivers can interact with the platform without taking their eyes off the road, and it also allows OEMs to take a hybrid approach by integrating with concierge services provided by third parties.
Providing an engaging experience
Head-up displays, which could be combined with AR (augmented reality) and AI (artificial intelligence) components, could be another avenue for automakers; however, the onus is on OEMs to come up with an engaging, attractive, on-brand experience.
“You want it to be not only functional, but also be beautiful—we definitely don’t want an 8-bit experience,” Arcaro said. “This is a very difficult concept, and up until now not a particular design strength among automakers.”
In addition to the vehicle’s ADAS interacting with the driver, the infotainment system and other components—many designed and provided by Tier 1 suppliers and other players in the component ecosystem—will have to ensure smooth transitions between engagement and disengagement.
Arcaro said drivers will experience the greatest impact not from driving features and infotainment systems that are innovative on their own, but when every technology—from sensors and audio to intelligent assistants—works together cohesively.
This could extend to AI and ML (machine learning) software embedded in future vehicles, he noted.
“If the vehicle has the capability to provide a level of autonomy and knows what the state of the driver is, mentally or physically, it could provide support beyond the usual—essentially taking over by using driver monitoring through an in-cabin camera,” he said. “That’s a pathway into what the driver is thinking and doing, and that opens up a new Pandora’s Box of trying to figure out what that driver is doing.”
From device-centric to experience-centric
Jason Johnson, Director of User Experience Design and Huemen Studio Lead with Harman, said that as more and more vehicles become autonomous, the industry is increasingly turning its attention from device-centric to experience-centric philosophies.
“Enabling consumer trust in the fully autonomous vehicles of tomorrow is an incremental process, and it starts by building incredible connected, technology-enabled vehicle experiences today,” he said. “Trust is something that is earned over time and requires consumers to be an active part of that journey.”
Driver experiences will increasingly incorporate intelligent technologies like AI, which will be able to enhance personalization, as systems like in-car mood state detection learn behaviors and adapt to each user’s personality and preferences.
“The development of ADAS systems also requires collaboration from OEMs, suppliers and technology companies like Harman, regulatory agencies, and media partners,” Johnson said. “Without this collaboration the industry risks operating in silos, which will slow down progress and deeply impact consumers by providing segmented driver experiences that don’t talk to each other.”
He predicts the industry will prioritize UX factors like augmented reality, which can support the driver and passengers with ADAS features, alerts, and information, and voice assistants, which enable bi-directional natural-language communication to ease operations.
“The age of self-driving cars has clearly demonstrated a need to raise UX to a new level,” he said. “However, until Level 5 autonomous cars are on the road, the industry must still tackle the problems of Level 3 and 4.”
Starting from zero
Ilijevic said the user experience in future Level 3 and Level 4 vehicles will be completely different and must be redefined from scratch, while taking previous experiences into account.
“Voice commands will play an important role in the future, for sure, and new developments based on artificial intelligence will bring huge improvements in the driver experience,” he said.
Once the car starts to drive in a semi-autonomous mode, drivers will focus their attention mainly on the entertainment system. And as the vehicle starts to interact more with its outside environment, it will be able to communicate that information with the vehicle’s occupants.
This could also provide more information to the driver on how the vehicle is driving and help build a driver’s trust in the car by knowing it can get around safely.
“Some of the information on the dashboard will be relevant only to more technical users,” he said. “We already develop UX/UI solutions with different views, but it’s time to go further.”