Consider this scenario: You are driving along, about to change lanes, when your car suddenly tenses up. The seatbelts tighten. The seat straightens up, the headrest moves forward. As you turn the wheel to the right, the car starts quivering, buzzing from the right side. "Calm down," you say, "I know what I’m doing."
A nervous, skittish car? A car distrustful of its driver? Yes, and often for good cause. Am I serious? Yes: Everything I have just described already exists in some high-end automobiles.
Consider the modern automobile. It is a wonder of computation: multiple CPUs, hundreds of miles of cabling. Automatic this and automatic that. Lots of automatic stuff controlling the engine, but more and more of it affects the driver: automatic transmission, anti-skid braking, stability controls, active cruise control, lane control, even automatic parking. Automatic payment for highway tolls, parking lots, and drive-through restaurants. Navigation systems, entertainment systems, HVAC systems, sometimes different for each passenger.
How do we automate sensibly, controlling some parts of the driving experience, but ensuring that drivers are kept alert and informed ("in the loop," as this is described in aviation safety)? How do we warn drivers when they are about to change lanes that there is another vehicle in the way? What about an obstacle on the road, perhaps detected by the vehicle’s systems, but still not visible to the driver? (The auto research labs are experimenting with systems that tell the car what is around the corner.)
What should be done when the car decides the best way to avoid a collision is to accelerate past the danger zone, but the driver steps on the brakes? Modern braking systems already disregard the driver’s actions and instead act according to what the designer has decided are the driver’s intentions. Should the car be allowed to ignore the brake pedal and just accelerate? Should the car prevent the driver from changing lanes when another vehicle is already there? Should the car prevent the driver from exceeding the speed limit? Or from going slower than the minimum limit, or from going too close to the car ahead? All this, and much more, is possible today. Asking the driver, or even giving the driver the relevant information, is not the answer: There simply isn’t enough time.
Cars today can almost drive themselves. Take adaptive cruise control, which adjusts the auto’s speed according to the distance of the car in front. Today, it arbitrarily relinquishes control below a certain speed (probably due to concerns from the lawyers). But it could easily control at any speed, including the slow crawl of heavy traffic and stops. Add lane control and automatic toll-payment systems, and the car could continue on a highway for hours while the driver slept. During a recent visit to an automobile company’s research lab in Japan, I was startled to discover that all of these components now exist in commercial or near-commercial form. Put them together, and oops, we are training drivers to be inattentive. In other words, the driver is no longer "in the loop." Even the research team was surprised when I pointed this out: They had never put it all together either.
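The handback behavior described above can be sketched in a few lines. This is not any vendor's algorithm; it is a toy proportional controller with invented thresholds and gains, meant only to make the structural problem concrete: the system tracks a following distance at highway speed, then silently disengages below a cutoff, precisely when an inattentive driver is least prepared to take over.

```python
# Illustrative sketch only; all constants are invented for illustration.
CUTOFF_SPEED = 8.0      # m/s; below this the system disengages (the "arbitrary" limit)
TARGET_HEADWAY = 2.0    # seconds of following distance to maintain
GAIN = 0.5              # proportional gain on the headway error

def cruise_step(own_speed, gap_to_lead_car):
    """Return (engaged, speed_adjustment).

    engaged=False models the sudden handback to the driver
    that the column warns about.
    """
    if own_speed < CUTOFF_SPEED:
        return False, 0.0               # system gives up; driver is on their own
    desired_gap = own_speed * TARGET_HEADWAY
    error = gap_to_lead_car - desired_gap
    return True, GAIN * error           # negative: slow down; positive: close the gap

# At highway speed the controller stays engaged and slows the car:
print(cruise_step(30.0, 50.0))   # → (True, -5.0): gap of 50 m is under the desired 60 m
# In a slow crawl it silently disengages:
print(cruise_step(5.0, 10.0))    # → (False, 0.0)
```

Nothing in the control law itself requires the cutoff; removing it (and handling stop-and-go) is straightforward, which is exactly why the remaining design question is about the driver, not the engineering.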
In many of the classical fields studied by engineering psychologists and human-factors engineers, there is a well-known and well-studied problem called overautomation. There have been accidents caused by poor communication between the automatic equipment and the human operators.
I once argued that the current state of automation in aviation was fundamentally unsound because it was in the middle. Either have no automation or have full automation, I argued, but what we have today is halfway automation. Even worse, the system takes over when the going is easy and gives up, usually without any warning, when the going gets tough. Just the reverse of what you would want. But in an airplane, when everything goes wrong and the plane starts crashing to earth, the highly skilled, professional crew has minutes to respond. In an automobile, the driver might have a second to respond.
The current designs for automobile automation are being done by engineers who are ignorant of the lessons learned from studies of automation. Here we go again. Each new industry fails to learn the lessons of the previous ones. So, once again, here is a field in desperate need of help, though it does not quite realize it. A field with new lessons to learn, and a lot of very old lessons that have to be taught once again. Unless we, HCI, the profession of interaction, get involved, there are apt to be serious repercussions along the way.
Yes, there are many good behavioral scientists at work on these issues of driver automation in universities, government laboratories, and the research labs of auto companies. A panel at CHI 2003 correctly called this area "The Next Revolution." But we are not being consulted when the engineering decisions are being made. Sure, the research labs are active, but who makes the product decisions? I fear that all the errors of the past, errors in nuclear-power control rooms, in process control rooms, in the control of ships, and in commercial aviation, will simply get repeated. All of the lessons will have to be relearned for yet another industry.
There is an automobile in HCI’s future, and the sooner the better.
Donald A. Norman
About the Author:
Don Norman wears many hats, including cofounder of the Nielsen Norman Group, professor at Northwestern University, and author. His latest book is Emotional Design. He lives at www.jnd.org.
©2005 ACM 1072-5220/05/1100 $5.00