The Future of Automotive HMI: Designing Haptic Icons
As the experience of riding in a car converges with consumer electronics, user experience is under the spotlight. Charlie Alexander, Director of Automotive Products, talks about Ultraleap and the University of Nottingham’s collaborative research into a user-centred, tactile language for future automotive HMI: the Hapticons Project.
The inexorable rise of software as a key driver of innovation and differentiation is changing the automotive industry from top to bottom. Volkswagen’s new Project Trinity sales model, which makes the experience of owning a car much closer to that of owning a smartphone, is the latest development in a fast-moving and disruptive wave of change.
The industry is marching towards connected, autonomous, shared, and electric (CASE) products. And in the future of automotive HMI, this puts user experience in focus like never before.
On the one hand, high-performance, centralised compute systems can deliver user experiences similar to those found in consumer electronics.
On the other, autonomous vehicles are eating away at traditional differentiators such as acceleration, braking, torque, and top speed. With these on the retreat, cabin experiences are becoming the new battleground in automotive.
More and more, brands are differentiating themselves by what you can do during a journey, as much as how you can drive. And, of course, premium brands need to deliver premium experiences.
The increasingly commonplace metaphor of cars as “smartphones on wheels” is useful in some ways. However, in the future of automotive HMI we can actually expect cabin experiences to be far beyond what smartphones offer.
In a connected car, you’re not holding your device in your hand. You’re sat inside it. It means that the entire three-dimensional space inside the car becomes the canvas for a user interface.
This creates the opportunity for automotive HMI to deliver deeply immersive experiences. Future cabins will add holographic displays to touchscreens and fuse voice, eye tracking, gesture control, haptics, and physical controls to create multi-sensory, user-centred automotive HMI.
Exploring the future of automotive HMI: The Hapticons Project
As part of exploring the immersive future of automotive HMI, Ultraleap and the University of Nottingham have been investigating new ways haptics can be used as part of a user-centric HMI: the Hapticons Project.
Hapticons (haptic icons) are tactile symbols projected onto a driver or passenger’s hand using Ultraleap’s mid-air haptic technology. They’re the tactile equivalents of, say, the trash can icon on your computer desktop. We envisage them as being bound to specific gestures in a gesture control automotive HMI system.
So, for example, when you make a gesture to turn down the temperature, you receive specific, identifiable haptic feedback on your hand. This indicates not only that your command has been accepted, but that the right function has been activated (temperature, not volume control or navigation).
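The binding described above can be sketched in code. This is a hypothetical illustration only: the gesture names, hapticon patterns, and dispatch structure below are invented for the example and are not part of Ultraleap's SDK.

```python
# Illustrative sketch: each gesture is bound to exactly one function and one
# hapticon, so the tactile feedback confirms both that the command was
# accepted and *which* function was activated. All names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Hapticon:
    """A tactile symbol rendered mid-air on the user's palm."""
    name: str
    pattern: str  # placeholder for a mid-air haptic waveform/trajectory

BINDINGS = {
    "swipe_down": ("temperature_down", Hapticon("temperature", "spiral")),
    "rotate_cw":  ("volume_up",        Hapticon("volume", "expanding_ring")),
    "point_hold": ("navigation",       Hapticon("navigation", "arrow_sweep")),
}

def handle_gesture(gesture: str):
    """Dispatch a recognised gesture; return (function, hapticon) or None."""
    if gesture not in BINDINGS:
        return None  # unrecognised gesture: no action, no feedback
    return BINDINGS[gesture]
```

Because the mapping is one-to-one, a user who feels the "temperature" hapticon after a swipe knows immediately that the system has not confused the gesture with volume or navigation control.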
We believe that in automotive HMI, immediate, intuitive haptic feedback:
- Reduces driver distraction: A 2019 study by the University of Nottingham showed that adding mid-air haptics to gesture control reduces driver distraction and cognitive load.
- Improves user experience: The same study showed that drivers strongly prefer gesture control when mid-air haptic feedback is added.
- Increases the perception of a premium cabin: Touch is up to three times more important than either vision or hearing in contributing to quality ratings of in-vehicle controls.
While gesture control systems are already commercially available (for example, in the BMW 5 and 7 Series, Jaguar XF Sportbrake, Mercedes-Benz S-Class, and 2019 VW Golf), a lack of corresponding tactile feedback remains the missing piece.
Using user-centred design to develop a haptic communication system for automotive HMI
There are many examples (not least, the existence of Braille) showing it is possible to use the sense of touch to communicate abstract symbols. However, no one has ever tried to use this innate human capability to create a communication system for automotive HMI before.
With no prior examples to draw on, a user-centred design approach was clearly essential.
The project followed five stages:
- Participatory design: We elicited metaphors for or associations with key infotainment features. What did participants associate with an incoming call, for example? We then uncovered common themes, enabling us to simplify and abstract the concepts so they could be transformed into tactile sensations. Based on this, we shortlisted 28 concept icons.
- Expert appraisal: The concept icons were reviewed by our haptic engineers and researchers. Criteria included recognisability, appropriateness, ease of learning, and feasibility. This reduced the shortlist to 17.
- Prototyping: The 17 shortlisted icons were prototyped using our latest haptic system, which delivers stronger mid-air tactile sensations.
- Salience validation study: 25 participants validated the shortlisted prototypes based on four criteria: instant identification (IID), recognisability, distinguishability, and instant articulatory directness (i.e. how obvious the metaphor is without any cues).
- Selection of seven icons: Based on a detailed analysis of the salience validation study, we selected seven icons to move forward with.
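The final selection step can be sketched as a simple ranking exercise. The criterion names follow the study above, but the scoring scheme, data structure, and sample numbers below are invented for illustration; the actual analysis was more detailed.

```python
# Hypothetical sketch: average per-participant ratings for each icon across
# the four salience criteria, then rank icons by overall mean score.
CRITERIA = ["instant_identification", "recognisability",
            "distinguishability", "articulatory_directness"]

def rank_icons(ratings):
    """ratings: {icon: {criterion: [per-participant scores]}}.
    Returns (icon, mean_score) pairs, best first."""
    scored = []
    for icon, by_criterion in ratings.items():
        means = [sum(by_criterion[c]) / len(by_criterion[c]) for c in CRITERIA]
        scored.append((icon, sum(means) / len(means)))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Invented sample data for two icons, rated 1-5 by three participants:
sample = {
    "incoming_call": {c: [5, 4, 5] for c in CRITERIA},
    "temperature":   {c: [3, 4, 3] for c in CRITERIA},
}
ranking = rank_icons(sample)  # "incoming_call" ranks first in this sample
```

In practice, a team would weight the criteria and inspect per-criterion scores rather than a single mean, but the principle of selecting the most salient icons from a validated shortlist is the same.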
The first two stages of the project were published in late 2020 as part of the proceedings of the Automotive UI 2020 conference.
Are hapticons usable in future automotive HMI?
Further research is needed. However, from the data collected in our salience study, we predict a very high recognition rate for the selected haptic icons when integrated into a fully fledged in-vehicle infotainment system (IVIS).
Our user-centred design process ensured the metaphors conveyed by the hapticons are as universal as possible.
The chosen sensations are also those that map tactile sensation onto abstract meaning the most clearly. As such, they offer the most promise for improving learning and recall of a combined haptic sensation and gesture set.
The next stage is to integrate the seven chosen hapticons into our latest automotive prototype. In collaboration with the University of Nottingham, the hapticons will undergo an automotive industry standard HMI assessment in a driving simulator. They will then be integrated into an instrumented car for an on-road study.
The sense of touch is the least researched and least understood of all of our senses. Enabling smart cars to communicate and build relationships with drivers and passengers via the sense of touch is a language we’re still writing the dictionary for.
Yet with the rapidly growing ability, and need, to create differentiation via automotive HMI and deliver more flexible, intuitive, and intelligent interface solutions, it's a language that has never offered more opportunity to the automotive industry.
Charlie Alexander is Director of Automotive Products at Ultraleap.
Explore our blogs, whitepapers and case studies to find out more.