
Continental: Heads Up

Working to improve driver safety and the overall driving experience, Continental is developing advanced technology that makes heads-up displays even more useful. Why heads-up displays? Because drivers keep their eyes where they belong: on the road ahead.


Heads-up display (HUD) technology is a driver information system that, says Eelco Spoelder, head of the Instrumentation & Driver HMI business unit at Continental Corp. (continental-corporation.com), is poised for significant growth. Continental is no stranger to the technology: BMW first offered a Continental color HUD back in 2003. Today, Spoelder says, 10 global OEMs offer HUD technology on a total of 19 models. For 2014, Spoelder reckons, the total number of HUDs produced will be on the order of 1.5-million units.

But about that growth: “By 2018, this figure will probably triple to around 5-million units worldwide.”

And he says that because of (1) the company’s long-time involvement in the technical development of HUD and (2) its manufacturing plant in Babenhausen, Germany, where HUDs are built (from producing the mirrors to machining the die-cast housings to final assembly) three shifts per day, five days a week for customers including BMW, Audi and Mercedes, Continental is well positioned to participate in that growth.

It is worth noting, however, that while HUDs may be associated with BMWs, Mercedes, Corvettes and other vehicles whose price tags make the option not much of a blip on the monthly payment, Spoelder says that one of their objectives is to develop systems that make HUD technology attainable for vehicles with more modest sticker prices. He says this evolution is another step similar to the rollout of their second-generation HUD system in 2010, which allowed midsized cars to offer the technology at a reasonable price.

HUDs Shrink; Market Expands
In 2015, Continental will add a “combiner” HUD to its lineup, designed specifically for smaller, less expensive vehicles. Whereas a conventional HUD uses the windshield to display information (even though the information appears to be some 2.5 meters ahead of the windshield, not on it), the combiner uses a small plastic or glass screen attached to the top of the instrument panel directly in front of the driver. Taking the windshield out of the optical path facilitates a significantly smaller package for the HUD system, requiring as little as half the ordinary volume, or only about two liters.

Spoelder says that when that 5-million unit mark is reached, the overall market will likely be split evenly, with windshield HUDs accounting for half and combiners the other half. And he goes on to say that HUD technology is really on an upswing, with more development to come.

Augmenting Reality
But the big story—bigger than the non-trivial market that HUD represents—is that there is a large group of people within Continental—ranging from physicists to cultural anthropologists, from graphic designers to electronics experts—who are developing an Augmented Reality Heads-Up Display (AR-HUD). Speaking to this technology, Dr. Pablo Richter, who is with the Continental Interior Div., says, “The AR-HUD optical system enables the driver to see an augmented display of the status of driver assistance systems and the significance of this information in their direct field of view.” He adds, “As a new part of the human-machine interface, the current, pre-production AR-HUD is already closely connected with the environmental sensors of driver assistance systems, as well as GPS data, map material, and vehicle dynamics data.”

While it is at a pre-production stage right now—meaning that the trunk of the Kia K9 sedan is chock-full of processors and as we drive the car, Dr. Richter is in the back seat, monitoring how things are operating—plans call for the AR-HUD to be ready for serial production in 2017.

The “augmented” part means, in effect, “additional”: this system provides more, supplemental information than a conventional HUD does. So with a conventional HUD, you might see your speed. With an AR-HUD, that doesn’t go away. It is still there. But (1) it is positioned in virtual space in a different place and (2) it is joined by additional information. In the case of the Continental demonstration vehicle, the additional information relates to adaptive cruise control (ACC), route guidance from the navigation system, and lane departure warning (LDW).

As regards the projection planes, the conventional HUD information is considered “near.” A given driver will adjust the seat position and height for purposes of driving the vehicle; consequently, there needs to be a corresponding adjustment of where the information is projected, which can be accomplished through the use of a dial on the instrument panel. This is known as establishing an “eye box.” The near plane appears to the driver 2.4 m away, or, in effect, at the end of the hood, in an area that is 210 x 42 mm. The image is produced by a picture generating unit (PGU) that consists of a thin-film transistor (TFT) display that’s backlit by LEDs. This is then bounced off of a curved mirror that both enlarges the information and “places” it at the end of the hood.

Then there is the second, further layer, the “augmentation” layer. It appears to be 7.5 m in front of the driver. Stephan Cieler, who worked on the human-factors setup for the AR-HUD, says that the distance is important because beyond 7.5 m the projected information would overlap the vehicles ahead and might interfere with the driving task.

Digital Mirrors
Again, this is done with mirrors. Specifically, a digital micromirror device (DMD) is at the center of the AR-HUD’s picture-generating unit. DMD is technology from Texas Instruments (ti.com) similar to that which TI developed for DLP Cinema, its digital approach to showing movies; a DMD for a commercial film has up to 8.8-million microscopic mirrors. In the case of the AR-HUD, the DMD, an optical semiconductor, includes thousands of tiny mirrors that are electrostatically tilted. The mirrors are illuminated by LEDs in the three primary colors (red, green and blue), with each mirror tilting either to reflect light into the optical path or to deflect it away. The light then goes to a focusing screen, is reflected onto a larger mirror, and then on to the windshield. The dimensions of the augmented viewing area are approximately 130 x 63 cm.
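
The two projection planes differ in both distance and physical size; the angular field of view each subtends at the driver’s eye (a figure the article doesn’t give) follows from basic trigonometry. A quick illustrative calculation using the article’s numbers:

```python
import math

def subtended_deg(size_m: float, distance_m: float) -> float:
    """Full angle subtended by an extent of size_m viewed from distance_m."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# Near plane: 210 x 42 mm image appearing 2.4 m away
near_h = subtended_deg(0.210, 2.4)   # ~5.0 degrees wide
near_v = subtended_deg(0.042, 2.4)   # ~1.0 degree tall

# Augmentation plane: ~130 x 63 cm image appearing 7.5 m away
far_h = subtended_deg(1.30, 7.5)     # ~9.9 degrees wide
far_v = subtended_deg(0.63, 7.5)     # ~4.8 degrees tall

print(f"near: {near_h:.1f} x {near_v:.1f} deg, far: {far_h:.1f} x {far_v:.1f} deg")
```

So although both images are small in absolute terms, the augmentation layer occupies roughly twice the angular width of the near plane, which is what lets its graphics overlay the road scene.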

Making the augmentation of reality in front of the driver possible is the “AR-Creator,” which is based on a 1.2-GHz quad-core processor. It fuses information from the radar sensor used for adaptive cruise control, a Continental mono camera used for lane keeping and object detection, and an “eHorizon,” which is essentially information generated from the navigation/GPS system (they are also looking at using information from vehicle-to-vehicle and vehicle-to-infrastructure communication, once that becomes available). Because the vehicle is moving, they use a Kalman filter algorithm (a linear quadratic estimator) to help determine the future state of affairs, with “future” being on the order of 80 ms.
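
To make the idea concrete, here is a minimal sketch of the general kind of Kalman filtering the article describes: a one-dimensional constant-velocity filter that tracks a measured position and extrapolates it about 80 ms into the future, so graphics can be drawn where an object will be rather than where it was at sensor time. All of the model structure and noise values here are illustrative assumptions, not Continental’s implementation.

```python
import numpy as np

DT = 0.080  # prediction horizon in seconds, per the article's ~80 ms figure

F = np.array([[1.0, DT], [0.0, 1.0]])   # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])              # the sensor measures position only
Q = np.diag([0.01, 0.1])                # process noise covariance (assumed)
R = np.array([[0.25]])                  # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; also returns the position extrapolated DT ahead."""
    # Predict the state forward one step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new position measurement z
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    # Extrapolate DT into the future for rendering
    x_future = F @ x
    return x, P, float(x_future[0, 0])

# Example: track an object receding at a steady 20 m/s
x, P = np.array([[0.0], [0.0]]), np.eye(2)
for k in range(1, 100):
    x, P, predicted_pos = kalman_step(x, P, np.array([[20.0 * k * DT]]))
```

After a few dozen updates the velocity estimate settles near 20 m/s, and the predicted position leads the last measurement by roughly 1.6 m (20 m/s x 0.08 s), which is the point of drawing against a short-horizon forecast.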

So, all this said, what does the driver see?

With adaptive cruise control (ACC) engaged, the vehicle in front is marked with a green crescent below its rear fascia, and a series of blue trapezoids, decreasing in size, runs from the front of the car being driven to the green crescent, with the trapezoids signifying distance based on time (between 1 and 1.2 seconds). Should the driver use the accelerator to override the ACC and get too close to the vehicle ahead, the green crescent turns red.
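
The underlying time-gap logic can be sketched in a few lines. This is a hypothetical reconstruction of the behavior just described, not Continental’s code; the function names and the 1-second threshold are illustrative assumptions drawn from the 1-to-1.2-second figure above.

```python
def time_gap_s(distance_m: float, own_speed_mps: float) -> float:
    """Time gap to the lead vehicle at the current speed."""
    return distance_m / own_speed_mps if own_speed_mps > 0 else float("inf")

def crescent_color(distance_m: float, own_speed_mps: float,
                   driver_override: bool, min_gap_s: float = 1.0) -> str:
    """Green crescent normally; red when the driver overrides ACC and closes in."""
    if driver_override and time_gap_s(distance_m, own_speed_mps) < min_gap_s:
        return "red"
    return "green"

# At 30 m/s (108 km/h), following 25 m behind is only a 0.83 s gap:
print(crescent_color(25.0, 30.0, driver_override=True))   # "red"
print(crescent_color(40.0, 30.0, driver_override=True))   # "green"
```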

For lane-keeping assist, a series of red “cat’s-eyes” appears “on” the road, on the right or left side, depending on which way the car is veering from its intended path. These are modeled on the road-mounted reflectors that are perhaps more familiar to drivers in Europe than in the U.S. (Supplementing the visual cues in the Kia K9 are haptic devices in the seat cushion that provide a left or right vibration.)

Then there is the navigation system, which uses a series of blue arrows to show the route. Just imagine a series of shapes like this ^ in blue, stacked in the direction one is to drive, shifting to the left or right when a turn is to be made. They call it a “fishbone.” Cieler says that when working on the interface they ran a number of tests to determine whether it might be better to have something like a solid carpet (think: yellow brick road) or solid arrows in front of the vehicle. These proved not to be as useful for the driver. The mantra used during the development of the human-machine interface was, Cieler says, the phrase generally associated with Mies van der Rohe: “Less is more.” The goal is not to overload the driver with inputs but to make the information useful and/or actionable.

Making It Smaller
Speaking of “less,” one of the challenges that Eelco Spoelder says Continental faces as it moves toward series production of the AR-HUD in 2017 is size. A conventional HUD system requires some 4 liters (244 in³) in the instrument panel. That’s about the size of a football. Spoelder says they’re working toward an AR-HUD system that would be about 11 liters (671 in³). That’s about the size of two soccer balls. Because the company has more than a decade’s worth of experience in the development, engineering and manufacturing of HUD systems, he is confident they will be able to achieve that size for the AR-HUD.

(One of the benefits of the combiner HUD is that it is the most compact of all, requiring only about 2 liters (122 in³), which allows deployment in compact cars.)
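
As a quick sanity check on the figures quoted above (this is simple unit conversion, not additional reporting), the metric and imperial volumes agree under the standard conversion of 1 liter ≈ 61.02 in³:

```python
# 1 liter = 61.0237 cubic inches (standard conversion factor)
L_TO_IN3 = 61.0237

# (liters, cubic inches as quoted in the article)
for liters, quoted in [(4, 244), (11, 671), (2, 122)]:
    computed = liters * L_TO_IN3
    print(f"{liters} L = {computed:.0f} in³ (article: {quoted} in³)")
```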

Space notwithstanding, Spoelder is convinced that AR-HUD is a key enabler as OEMs move more toward automated driving systems: “We are certain that AR-HUD technology will make it even easier for future ADAS [advanced driver assistance systems] functions and automated driving functions to gain acceptance among end customers.” Why? Because they feel that even if the vehicle is partially or fully driving itself, people still want to know what’s going on. And by having augmented reality in addition to the now-conventional heads-up readouts, there is a greater potential awareness. 

Gardner Business Media - Strategic Business Solutions