Automotive Trends Drive Vehicular Displays

Display Week will offer many opportunities to learn about the latest advancements in vehicular displays. Trends such as larger and more plentiful automotive displays, HUDs, the connected car, ADAS, and autonomous driving are informing these new developments.

by Ken Werner

SID’s Display Week 2017, held in Los Angeles, May 22–26, at the Los Angeles Convention Center, will feature an automotive Market Focus Conference on Tuesday, May 23, and a substantial vehicle displays technology track on May 24 and May 25. There will be plenty to see and hear at these events; in the meantime, let’s look at some of the ongoing developments in automotive systems for perspective on where and how the various technologies that will be discussed at Display Week might – or might not – fit in.

More and Bigger Displays

The average size of automotive displays is increasing, with center-stack displays heading toward 8 inches for now. As Tesla has demonstrated, display sizes can, and will, go much larger, and on cars far more affordable than the first two Tesla models. Of course, sizes and configurations will vary with the automotive interior design approach, vehicle market position, and trim level.

Automotive original equipment manufacturers (OEMs) are also planning on more displays per vehicle (Fig. 1).

Fig. 1:  There is no such thing as too many displays in a car, Mitsubishi seems to be saying with this plug-in hybrid electric vehicle (PHEV) concept car. Rendering: Mitsubishi Motors

One near-term application for luxury vehicles is replacing side-view mirrors with cameras and interior displays. This will get the mirror structures out of the airflow for reduced wind noise and slightly improved gas mileage. In addition, designers are looking forward to getting the mirrors off their carefully sculpted surfaces. A more fanciful idea is putting displays on the B and C pillars, which will show camera images of what is on the other side of the pillars, thus making the pillars “transparent” and doing away with their blind spots. (For more about these “transparent” pillar displays, see the article, “Plastic Displays Will Play a Major Role in Automotive HMIs,” in this month’s issue.)

Most important, though, is that the display suite must support increasingly complex automotive systems and various levels of connectivity and autonomy. That means that the displays must work together with other input/output technologies such as touch, audio, gesture control, and haptics to create a reliable and effortless human-machine interface (HMI). Given the increasingly multi-modal nature of these interfaces, system designers are calling them MMIs (multi-modal interfaces). There will be presentations at Display Week evaluating the relative effectiveness of different modal combinations and implementations.

Touch-panel makers worry about how to make touch panels reliable in automotive environments, but no matter how reliable they are, a driver reaching to touch a center-stack panel will be distracted and may have to take his eyes off the road. Here is one place where haptics developers see an opportunity, both for displays and for soft buttons on steering wheels.

I did not see any papers on voice recognition in the program, but its development for automotive use is inevitable because it solves the problem of reaching for soft buttons while driving. Given the rapidly increasing sophistication of digital assistants based on artificial intelligence (AI) and voice recognition, we are sure to see voice recognition used increasingly for information input and control beyond phone dialing.

Heads Up!

Automotive OEMs and their suppliers see a big future for head-up displays (HUDs), especially in combination with augmented-reality (AR) systems. Many developmental systems and components were shown at the Consumer Electronics Show (CES) in January. There has been speculation that when HUDs get good enough, there will be no need to show the same information on an instrument cluster. Certainly, removing the cluster will free up precious space behind the dash.

Projection HUDs do face challenges with image size, viewing angle, eye box, and the volume the projector occupies under the dash, but automotive suppliers such as Continental are working on those problems vigorously. Although it is likely that projection will be the dominant HUD technology, alternative approaches are being demonstrated. Among these are Lumineq’s thin-film electroluminescent (TFEL) display sandwiched inside the windshield and LG Display’s transparent OLED mounted on the dash between the windshield and the driver, but these displays are early in their development and have challenges of their own to overcome.

Developmental issues aside, HUDs are fated to become commonplace in many vehicle market segments, and the instrumentation and AR content they show will become far more extensive.

ADAS

Advanced driver-assist systems (ADAS) – such as automatic emergency braking, lane keeping, adaptive cruise control, and self-parking – are proliferating now, with consumer buy-in rates between 5 and 20%. In other words, up to 20% of consumers are currently willing to pay significantly more for their new vehicles to include these options. In the US, back-up cameras and sensor alerts had a buy-in of 80% in 2016, assisted by a National Highway Traffic Safety Administration requirement that they be included on new light vehicles (cars and light trucks), phasing in by mid-2017 and required on all new light vehicles by mid-2018.

As is well known, original-equipment back-up cameras generally send their output to the vehicle’s center-stack display, while aftermarket cameras generally send theirs to a separate display. In a wired installation, both camera and display draw power from the vehicle’s electrical system, and a video cable must be run from the camera to the display. Recently, wireless back-up camera systems have been introduced. The camera is integrated into a replacement license-plate frame and runs on battery power, which may be supplemented with solar cells. The signal is carried forward over Wi-Fi or Bluetooth, and in two recent examples, the display is the user’s cell phone. In at least one case, data from the vehicle’s on-board diagnostics (OBD-II) port tells the system when the car is in reverse.
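
To make that last step concrete, here is a minimal sketch of how such a system might poll the OBD-II port for reverse gear, assuming the open-source python-OBD library. Gear position is not a standardized OBD-II parameter, so the mode-22 PID, the response length, the decoder, and the reverse code below are all invented for illustration; a real product would use the manufacturer-specific request for its target vehicles.

```python
# Minimal sketch: wake a back-up display when the car shifts into reverse.
# The PID b"22F40C", the response length, and REVERSE_CODE are hypothetical;
# gear-position PIDs are OEM-specific.
import time
import obd

connection = obd.OBD()  # auto-connect to the first OBD-II adapter found

GEAR_CMD = obd.OBDCommand(
    "GEAR_POS",                    # name
    "Transmission gear position",  # description
    b"22F40C",                     # hypothetical manufacturer-specific PID
    3,                             # expected response length in bytes
    lambda messages: messages[0].data[-1],  # crude decoder: last data byte
)
REVERSE_CODE = 0x02  # assumed encoding for "reverse"

while True:
    rsp = connection.query(GEAR_CMD, force=True)  # force: non-standard command
    if not rsp.is_null() and rsp.value == REVERSE_CODE:
        print("Reverse engaged: wake display and switch to camera feed")
    time.sleep(0.2)  # poll a few times per second
```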

Making an aftermarket system wireless is an excellent idea, but using the cell phone for the display is not, and is the lazy way out. Readers of Information Display will be able to think of many ways to incorporate a dedicated display. One way is to integrate the display into a replacement rear-view mirror. Since many OEM mirrors are powered, there will often be a convenient source of 12 volts for such installations. (This has in fact been offered for a number of years.)

John Sousanis, managing director at WardsAuto, outlined the appeal of ADAS at a recent conference: “ADAS has a very efficient path forward in the industry. It has a simple goal: Save lives immediately. ADAS doesn’t require changes to the infrastructure, doesn’t require connectivity, doesn’t take away the driver’s independence; the change it asks of us is incremental and largely situational. And it’s very cost efficient.

“Furthermore, ADAS fits within the existing industry model, it has lots of room for independent supplier innovation, [and] innovation can continue past the initial installation with software and algorithm updates and advancements so that it’s an ongoing process.”

There are significant opportunities here, both for original-equipment and aftermarket products, some of which require or could make use of displays.

The Connected Car

ADAS may be appealing right now because current implementations don’t require connectivity, but limited connectivity is already here, and more – a lot more – is coming. The 2017 Infiniti Q50, for example, uploads selected vehicle and infotainment data by default. Some high-end radar/lidar detectors use data crowd-sourced from all of the brand’s users to build a “map” of fixed-location radar sources, such as the emitters in automatic doors and other non-law-enforcement equipment, so the detector can safely ignore alerts from those locations. (Law-enforcement lidar measures a target’s speed by illuminating the target with laser light.)
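
The suppression logic itself is simple to sketch. Assuming the crowd-sourced “map” is just a list of (latitude, longitude) points for known fixed non-law-enforcement sources, an alert can be muted whenever the car is within a small radius of one of them. The 150-m radius and the names below are illustrative, not any vendor’s actual implementation.

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Crowd-sourced list of fixed, non-law-enforcement radar sources (illustrative).
FIXED_SOURCES = [(41.3083, -72.9279), (41.3100, -72.9300)]
MUTE_RADIUS_M = 150  # assumed: alerts this close to a known source are muted

def should_alert(vehicle_lat, vehicle_lon):
    """Return True unless the car is near a known fixed false source."""
    return all(
        haversine_m(vehicle_lat, vehicle_lon, lat, lon) > MUTE_RADIUS_M
        for lat, lon in FIXED_SOURCES
    )
```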

But the next widespread type of connectivity will be vehicle-to-vehicle (V2V), which will permit more effective collision avoidance, traffic merging, adaptive cruise control, and other functions. After that will come vehicle-to-everything (V2X) connectivity, which adds communication with infrastructure and the network and will work in conjunction with V2V. At this point the applications become almost mind-boggling. HERE, the German-owned company that maintains a cloud-based digital mapping service, is collaborating with New York University’s Multimedia and Visual Computing Lab on an HD live map program. Eventually, connected vehicles will continually update the map in real time to maintain car-to-map precision within 10 cm (4 inches).

BMW will be installing data-generation technology from the Israeli company Mobileye in its cars beginning with the 2018 model year. The data will be uploaded to HERE, which will use the data to update its real-time cloud service for automated vehicles.

Another type of application is “broad data.” One example is uploading speed and brake-point data for many vehicles at a particular turn. That data could be used for reprogramming the safe distance between autonomous cars at that turn.
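
As a toy illustration of how such broad data might become a control parameter, the sketch below reduces crowd-sourced brake-point observations at one turn to a conservative following-distance setting. The percentile choice and safety margin are assumptions for illustration, not a published method.

```python
import statistics

def safe_gap_for_turn(brake_points_m, margin_m=10.0):
    """Derive a conservative inter-vehicle gap for one turn from
    crowd-sourced brake-point distances (meters before the turn at
    which vehicles began braking). Illustrative only."""
    # Use a high quantile so the setting covers nearly all observed behavior.
    p95 = statistics.quantiles(brake_points_m, n=20)[-1]  # ~95th percentile
    return p95 + margin_m

# Example: brake points uploaded by many vehicles at one turn.
observations = [42.0, 51.5, 38.2, 60.1, 47.3, 55.0, 44.8, 49.9, 58.4, 41.0]
print(f"Recommended gap: {safe_gap_for_turn(observations):.1f} m")
```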

Cars with lots of AI, whether they are autonomous or not, will require software updates. It is clearly more attractive to download the updates to all applicable vehicles than to send letters to their owners and hope they bring their vehicles into the dealership. OEMs and AI companies are very concerned that once an upgrade is decided upon, all vehicles be updated reliably and more or less at the same time.
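
One widely used ingredient for making such over-the-air updates reliable is to verify each downloaded image against a signed manifest before committing it, keeping the previous image as a fallback. The sketch below shows only the digest check; the function and the A/B-partition comment describe a common pattern, not any OEM’s actual pipeline.

```python
import hashlib

def verify_update(image_path: str, expected_sha256: str) -> bool:
    """Check a downloaded update image against the manifest's digest
    before committing it to the inactive (A/B) partition. Sketch only."""
    h = hashlib.sha256()
    with open(image_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256

# A real system would also verify a signature over the manifest itself and
# report success or failure back to the OEM so stragglers can be retried.
```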

Autonomous Vehicles

SAE International defines six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation). All of the automated levels will require sensors for situational awareness (Fig. 2). ADAS bridges Levels 1 and 2, depending on whether we’re talking about a single function or the integration of two or more functions. In either case, the driver is always responsible for controlling the car, even if the car may intervene under special circumstances, such as automatic emergency braking.

Fig. 2:  Autonomous vehicles may mount a full array of sensors for situational awareness. The outputs of the sensors will be combined through what is called sensor fusion, and the results analyzed with an AI system located either on board or in the cloud, which will identify surroundings and threats. Photo: Ken Werner

Level 5 is full autonomy under all conditions: set your destination and go to sleep. Level 4 is full autonomy under certain sets of conditions, such as “highway driving in clear weather with no more than moderate traffic density.”

That leaves Level 3: autonomous driving, but without the expectation that the car will always have adequate situational awareness. When the car loses adequate situational awareness, say when a light snow obliterates highway lane markings, it passes control back to the driver. And that’s a problem. People are not good at switching tasks quickly, and that’s especially true when a driver has lost situational awareness because he’s texting, reading, or video conferencing. So, when the car tells the driver to take over, the driver must realize what the car is instructing him to do, withdraw from the task he’s performing, spend the needed time to attain situational awareness, resume control, and – hopefully – take appropriate actions. And by the time the driver does all that, there is a reasonable chance that the car will be in an even worse situation.

After studying how this process might be reliably accelerated, many investigators have concluded that Level 3 should be skipped, while others still feel it’s feasible. Ford made its position official in mid-February. It will skip directly to Level 5, and will introduce driverless cars in 2021 (Fig. 3). Ford is guaranteeing a human driver won’t be able to foul things up by omitting the steering wheel, brake, and gas pedal from these cars, said Ford product development chief Raj Nair, according to Bloomberg Technology. These cars may very well be Uber-style taxis.

Fig. 3:  A Velodyne lidar sensor is mounted on a developmental Ford vehicle. Lidar is regarded as an essential technology for autonomous vehicles. Ford recently announced it would field Level 5 autonomous vehicles in 2021. Photo: Ken Werner

Some manufacturers, including BMW and Audi, will be rolling out Level 3 cars next year that give drivers at least 10 seconds to take over from the system. That may be enough time for the driver to do his task-switching and acquisition of situational awareness, but can the system guarantee that nothing untoward will happen during those 10 seconds? This will be interesting.
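
The handoff those systems must manage can be sketched as a small state machine: when situational awareness degrades, the car requests a takeover, and if the driver has not taken the controls within the window, it falls back to a minimal-risk maneuver such as slowing to a stop. The states, the tick interface, and the 10-second constant below are illustrative, not any OEM’s actual design.

```python
import enum

class Mode(enum.Enum):
    AUTONOMOUS = enum.auto()
    TAKEOVER_REQUESTED = enum.auto()
    MANUAL = enum.auto()
    MINIMAL_RISK = enum.auto()  # e.g., slow to a stop on the shoulder

TAKEOVER_WINDOW_S = 10.0  # the kind of window discussed above

def step(mode, awareness_ok, driver_has_controls, request_time, now):
    """One tick of a toy Level 3 handoff state machine (illustrative).
    Returns the new mode and the (possibly updated) request timestamp."""
    if mode is Mode.AUTONOMOUS and not awareness_ok:
        return Mode.TAKEOVER_REQUESTED, now   # alert driver, start the clock
    if mode is Mode.TAKEOVER_REQUESTED:
        if driver_has_controls:
            return Mode.MANUAL, None          # clean handoff
        if now - request_time > TAKEOVER_WINDOW_S:
            return Mode.MINIMAL_RISK, None    # driver never responded
    return mode, request_time
```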

OEMs feel that displays are critical for introducing drivers – or, at Level 5, owner/passengers – to the systems and developing confidence in them. According to this thinking, it is important for drivers to know what the system knows and how it is making its decisions, so the driver can learn to trust the car.

And there’s another entire area for enhanced displays. When your car does the driving, what are you going to do? See Fig. 4. (Suggestions: video conference, email, text, watch a movie, read something.)

The car must entertain and inform you, but the ways in which it does that will be subject to constant change. When OEMs can no longer differentiate their products on the basis of the conventional driving experience, they will have to innovate elsewhere. With many concept vehicles featuring displays that conform to interior curved surfaces, there’s no question that flexible OLED will play an important role. Continual novelty will be essential.

Fig. 4:  When the car drives itself, what do you do? You might want to watch the multiple wide-aspect-ratio door-mounted displays. Photo: Daimler Benz

Displays Beyond the Imagination

Autonomous vehicles will talk to each other, talk to the cloud, and talk to you. But the communication will go both ways. Particularly for semi-autonomous vehicles, it will be useful for the car to observe you through video and perhaps bio-sensing to determine whether you are awake, alert, and looking out the windshield as you should be. Even in fully autonomous vehicles, a car that can determine your mood could, for instance, play suitable music or video, or suggest a stop at a bar or coffee shop. Perhaps OLED lighting across the car's interior surfaces would supply mood lighting to match.
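
A common building block for the “awake, alert, and looking out the windshield” check is the eye aspect ratio (EAR) computed from facial landmarks: it falls toward zero as the eyes close, so a sustained low value suggests drowsiness. The sketch below assumes the six-points-per-eye layout produced by common landmark detectors; the threshold and frame count are illustrative.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered p1..p6 as in the
    common 68-point facial-landmark layout. EAR = (|p2-p6| + |p3-p5|) /
    (2 * |p1-p4|); it drops toward zero as the eye closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

EAR_THRESHOLD = 0.2  # illustrative; tuned per camera and driver in practice
CLOSED_FRAMES = 48   # ~2 s of low EAR at 24 fps before flagging drowsiness
```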

OEMs, Tier 1 suppliers, and university researchers are reimagining the automobile. Those reimaginings include displays, some of which do not yet exist.  •


Ken Werner is Principal of Nutmeg Consultants, specializing in the display industry, manufacturing, technology, and applications, including mobile devices and television. He consults for attorneys, investment analysts, and companies re-positioning themselves within the display industry or using displays in their products. He is the 2017 recipient of the Society for Information Display’s Lewis and Beatrice Winner Award. You can reach him at kwerner@nutmegconsultants.com.