Augmented reality head-up displays: HMI impacts of different fields of view on user experience
[Image: Horizon cockpit. Photo credit: Visteon]
Head-up display (HUD) technology in aviation has been in use since the 1940s. It keeps the pilot’s attention focused outside the cockpit and supports aircraft control by visualizing information on a transparent combiner within the pilot’s main sight line. What worked for pilots also works for drivers. “The Oldsmobile 1988 Cutlass Supreme Indianapolis Pace Car and limited edition replica cars were the world’s first production automobiles equipped with a HUD” (Source: GM). In 2001, GM featured the first color windshield HUD (W-HUD) in the Corvette.
With the introduction of less expensive combiner HUDs (C-HUD) as an alternative to W-HUD technology in 2005 by PSA, one can observe a rising market demand and penetration of both HUD technologies up to this date. Whereas a W-HUD uses the windshield for picture reflection, the C-HUD utilizes a transparent screen in front of the windshield that works as a reflection plane. The value of both technologies is subjectively and objectively measurable – high subjective acceptance and reduced driver distraction due to less eye accommodation, eye adaptation, information pick-up time, and sight deviation, amongst others (Milicic, 2009). Depending on optics, technology, and available package, simple C-HUDs can present information up to 2 m in front of the driver’s line of sight. W-HUDs typically present information at a distance of 2.5 m. In the future, automakers will offer augmented reality HUDs (AR-HUD), pushing the virtual image position beyond 7 m.
Following these technology trends and our customers’ demands, Visteon’s advanced technology group is also developing this innovative technology. This paper focuses on the properties that influence drivers’ HUD experience. From the end user’s perspective, a general question is: what is the benefit of AR-HUD technology compared with conventional HUD solutions? From the designer’s perspective: what could the requirements be when designing HMI for cars with AR-HUD technology? Next to the virtual image position, there are a number of additional aspects to consider, such as differences in the location of the image within the field-of-view (FoV), human factor requirements, and the design of the HMI, which means defining what information is shown where, when, and how. The C-HUD image is perceived within the combiner, positioned on top of the instrument panel. Since the combiner is transparent, the picture seems to float above the instrument cluster in front of the lower windshield area. Look-down angles can vary between -2° and -5°. The C-HUD FoV of 3°x1° up to 6°x2.5° is relatively small. Free of the physical and visible restrictions of the combiner, the W-HUD image can be located higher, which means a smaller look-down angle. The W-HUD picture floats above the bonnet, in front of the street. With a FoV from 4°x2° up to 10°x2.5°, drivers subjectively perceive almost double the width of the virtual image. This picture can cover critical traffic environments outside of the windshield. AR-HUD images can feature FoVs of up to 17°x6.5°. Their look-down angle can reach 0°, which means the image center can be perceived in front of the horizon.
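The relationship between FoV and perceived virtual image size can be illustrated with a short calculation. This is a minimal sketch, assuming the simple flat-image geometry width ≈ 2 · d · tan(FoV/2); the function name is ours, and the FoV/distance pairings are representative values drawn from the ranges quoted above, not exact product specifications:

```python
import math

def virtual_image_size(fov_h_deg, fov_v_deg, distance_m):
    """Approximate virtual image width/height (in meters) for a given
    horizontal/vertical FoV (degrees) and virtual image distance."""
    width = 2 * distance_m * math.tan(math.radians(fov_h_deg) / 2)
    height = 2 * distance_m * math.tan(math.radians(fov_v_deg) / 2)
    return width, height

# Representative pairings from the text: (name, FoV_h, FoV_v, image distance)
for name, fov_h, fov_v, d in [("C-HUD", 6, 2.5, 2.0),
                              ("W-HUD", 10, 2.5, 2.5),
                              ("AR-HUD", 17, 6.5, 7.0)]:
    w, h = virtual_image_size(fov_h, fov_v, d)
    print(f"{name}: {w:.2f} m x {h:.2f} m")
```

Under these assumptions the AR-HUD virtual image is roughly 2 m wide, which helps explain why it can overlay whole lanes and objects rather than only a small information strip.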
In the most frequent use cases this represents a theoretical value, since the most useful assistance depictions while driving are needed between the horizon and the hood. This can be achieved with a look-down angle of 1°. Even the fact that the virtual image is optically perceived at only 7 m does not prevent the driver from bringing this information into context with objects that are much further away: the human eye’s accommodation (near-far adjustment) is set to infinity beyond roughly 4 m, and the human brain is the greatest “tool” for making sense of presented visual information. Both conventional HUD solutions, C-HUD and W-HUD, can feature driver information and ADAS representations within the primary sight line. Nevertheless, C-HUD and W-HUD depictions do not fully augment the reality one is driving through. In other words, these two technologies show concise information in a different location. Navigation routing symbols, for instance, need to be cognitively processed by the driver and applied to the actual situation. This procedure costs milliseconds of time and attention resources (Morita, National Traffic Safety and Environment Laboratory).
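The effect of the look-down angle can also be made concrete with simple geometry: on a flat road, a sight line dropped by a given angle meets the road surface at a distance of eye height divided by the tangent of that angle. The sketch below assumes a driver eye height of 1.2 m, which is an illustrative value of ours, not one from the text:

```python
import math

def road_intersection_distance(look_down_deg, eye_height_m=1.2):
    """Distance ahead (m) at which a sight line with the given look-down
    angle meets a flat road. eye_height_m is an assumed typical driver
    eye height, not a value from the paper."""
    return eye_height_m / math.tan(math.radians(look_down_deg))

print(round(road_intersection_distance(1.0), 1))  # ~68.7 m for a 1 deg look-down angle
print(round(road_intersection_distance(5.0), 1))  # ~13.7 m for a 5 deg look-down angle
```

This illustrates why a roughly 1° look-down angle places the image in the useful band between hood and horizon, while the steeper C-HUD angles of 2° to 5° correspond to road positions only a few car lengths ahead.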
In order to reduce cognitive workload while driving, AR-HUD systems could make a difference by reducing distraction and making driving safer. Driver information, assistance, and attention management can be shown not only in the driver’s line of sight but also matched to the real traffic environment – a true augmented experience. Navigation routing, for instance, can point directly into streets, obstacles can be directly highlighted, and adaptive cruise control can be directly brought into context within the FoV. The ideation of possible functional use cases and their benefits is a huge topic within HMI design, but too broad to be covered in this publication, as are the ongoing technical efforts to increase the FoV to the point where the whole windshield might become an AR-HUD display in the future.