Authors: Jonathan Grudin
Posted: Thu, July 17, 2014 - 4:50:11
Some graphic artists and designers who spent years on the edges of software development describe with bemusement their decades of waiting for appreciation and adequate computational resources. Eventually, visual design soared. It has impressed us. Today, design faces complexities that come with maturity. Cherished aesthetic principles deserve reconsideration.
An enthusiastic consumer
People differ in their ability to create mental imagery. I have little. I recognize some places and faces but can’t conjure them up. The only silver lining to this regrettable deficit is that everything appears fresh; the beauty of a vista is not overshadowed by comparison with spectacular past views. I’m not a designer, but design can impress me.
The first HCI paper I presented was inspired by a simple design novelty. I had been a computer programmer, but in 1982 I was working with brain injury patients. A reverse video input field—white characters on a black background—created by Allan MacLean looked so cool that I thought that an interface making strategic use of it would be preferred even if it was less efficient. I devised an experiment that confirmed this: aesthetics can outweigh productivity [1].
Soon afterward, as the GUI era was dawning, I returned to software development. A contractor showed me a softly glowing calendar that he had designed. I loved it. Our interfaces had none of this kind of beauty. He laughed at my reaction and said, “I’m a software engineer, not a designer.” “Where can I find a designer?” I asked.
I found one downstairs in Industrial Design, designing boxes. As I recall, he had attended RISD and had created an award-winning arm that held a heavy CRT above the desktop, freeing surface space and allowing repositioning with a light touch. I interested him in software design. It took about a year for software engineers to value his input. Other designers from that era, including one who worked on early Xerox PARC GUIs, recount working cheerfully for engineers despite having little input into decisions.
Design gets a seat at the table
I was surprised by design’s slow acceptance in HCI and software product development. Technical, organizational, cultural, and disciplinary factors intervened.
Technical. Significant visual design requires digital memory and processing. It is difficult to imagine now how expensive they were for a long time. As noted in my previous post, and in the recent book and movie about Steve Jobs, the Macintosh failed in 1984. It succeeded only after models with more memory and faster processors came out, in late 1985 and in 1986. Resource constraints persisted for another decade. The journalist Fred Moody’s account of spending 1993 with a Microsoft product development team, I Sing the Body Electronic, details an intense effort to minimize memory and processing. The dynamic of exponential growth is not that things happen fast—as in this case, often they don’t—it is that when things finally start to happen, they happen fast. In the 2000s, constraints of memory, processing, and bandwidth rapidly diminished.
Organizational. The largest markets were government agencies and businesses, where the hands-on users were not executives and officers. Low-paid data entry personnel, secretaries who had shifted from typing to word processing, and other non-managerial workers used the terminals and computers. Managers equipping workers wanted to avoid appearing lavish—drab exteriors and plain functional screens were actually desirable. I recall my surprise around the turn of the century when I saw a flat-panel display in a government office; when I complimented the worker on it, her dour demeanor vanished and she positively glowed with pride. For decades, dull design was good.
Sociocultural. The Model T Ford was only available in black. Timex watches and early transistor radios were indistinguishable. People didn’t care. When you are excited to own a new technology, joining the crowd is a badge of honor. Personalization comes later—different automobile colors and styles, Swatches, distinctive computers and interfaces. The first dramatically sleek computers I saw were in stylish bar-restaurants.
Disciplinary friction. Software engineers were reluctant to let someone else design the visible part of their application. Usability engineers used numbers to try to convince software developers and managers not to design by intuition; designers undermined this. In turn, designers resented lab studies that contested their vision of what would fare well in the world. The groups also had different preferred communication media—prototypes, reports, sketches.
These factors reflected the immaturity of the field. Mature consumer products relied on collaboration among industrial design, human factors, and product development. Brian Shackel, author of the first HCI paper in 1959, also worked on non-digital consumer products and directed an ergonomics program with student teams drawn from industrial design and human factors.
As computer use spread in the 1990s, HCI recognized design, sometimes grudgingly. In 1995, SIGCHI backed the Designing Interactive Systems (DIS) conference series. However, DIS failed to attract significant participation from the visual design community: Papers focused on other aspects of interaction design. In the late 1990s, the CMU Human-Computer Interaction Institute initiated graduate and undergraduate degrees with significant participation of design faculty.
This is a good place to comment on the varied aspects of “design.” This post outlines a challenge for visual or graphic design as a component of interaction design or interface design focused on aesthetics. Practitioners could be trained in graphic art or visual communication design. Industrial design training includes aesthetic design, usually focused on physical objects that may include digital elements. Design programs may include training in interaction design, but many interaction designers have no training in graphic art or visual communication. CHI has always focused on interaction design, but had few visual designers in its midst. “Design” is of course a phase in any development project, even if the product is not interactive and has no interface, which adds to the potential for confusion.
Design runs the table
Before it popped in 2000–2001, the Internet bubble dramatically lowered prices and swelled the ranks of computer users, creating a broad market for software. This set the stage for Timex giving way to Swatch. In the 2000s, people began to express their identity through digital technology choices. In 2001, the iPod demonstrated that design could be decisive. Cellphone buddy lists and instant messaging gave way to Friendster, MySpace, Facebook, and LinkedIn. The iPod was followed by the 2003 BlackBerry, the iPhone in 2007, and other wildly successful consumer devices in which design was central.
The innovative Designing User Experience (DUX) conference series of 2003–2007 drew from diverse communities, succeeding where DIS had failed. It was jointly organized by SIGCHI, SIGGRAPH, and AIGA—originally the American Institute of Graphic Arts, founded in 1914 and the largest professional organization for design.
The series didn’t continue, but design achieved full acceptance. The most widely read book in HCI may be Don Norman’s The Psychology of Everyday Things, published in 1988 and republished in 2002 as The Design of Everyday Things. Two years later Norman published Emotional Design.
Upon returning to Apple in 1997, Steve Jobs dismissed Apple’s HCI group and vice president Don Norman. Apple’s success with its single-minded focus on design has had a wide impact. For example, the job titles given to HCI specialists at Microsoft evolved from “usability engineers” to “user researchers,” reflecting a broadening to include ethnographers, data miners, and others, and then to “design researchers.” Many groups focused on empirical assessment of user behavior, once managed in parallel with Design, are now managed by designers.
Arrow or pendulum?
Empowered by Moore’s law, design has a well-deserved place at the table, sometimes at the decision-maker’s right hand. But design does not grow exponentially. Major shifts going forward will inevitably originate elsewhere, with design being part of the response. An exception is information design—information is subject to such explosive growth that tools to visualize and interact with it will remain very significant. Small advances will have large consequences.
In some areas, design may have overshot the mark. A course correction seems likely, perhaps led by designers but based on data that illuminate the growing complexity of our relationships with technology and information. We need holistic views of people’s varied uses of technology, not “data-driven design” based on undifferentiated results of metrics and A/B testing.
I’d hesitate to critique Apple from Microsoft were it not for the Windows 8 embrace of a design aesthetic. Well-known speakers complain that “Steve Ballmer followed Steve Jobs over to the dark side,” as one put it. They are not contesting the value of appearance; they are observing that sometimes you need to do real work, and designs optimized for casual use can get in the way.
My first HCI experiment showed that sometimes we prefer an interface that is aesthetic even when there is a productivity cost. But we found a limit: When the performance hit was too high, people sacrificed the aesthetics. Certainly in our work lives, and most likely in our personal lives as well, aesthetics sometimes must stand down. Achieving the right balance won’t be easy, because aesthetics demo well and complexity demos poorly. This creates challenges. It also creates opportunities that have not been seized. Someone may be doing so out of my view; if not, someone will.
Aesthetics and productivity
Nature may abhor a vacuum, but our eyes like uncluttered space. When I first opened a new version of Office on my desktop, the clean, clear lettering and white space around Outlook items were soothing. It felt good. My first thought was, “I need larger monitors.” With so much white space, fewer message subject lines fit on the display. I live in my Inbox. I want to see as much as my aching eyes can make out. I upsized the monitors. I would also reduce the white space if I could. I’d rather have the information.
A capable friend said he had no need for a desktop computer—a tablet suffices, perhaps docked to a larger display in his office. Maybe our work differs. When I’m engaged in a focal task, an undemanding activity, or trying out a new app, sparsity and simplicity are great. When I’m scanning familiar information sources, show me as much as possible. As we surround ourselves with sensors, activity monitors, and triggers, as ever more interesting and relevant digital information comes into existence, how will our time be spent?
Airplane pilots do not want information routed through a phone. They want the flight deck control panel, information densely arrayed in familiar locations that enable quick triangulations. If a new tool is developed to display airspeeds recorded by recent planes on the same trajectory, a pilot doesn’t want a text message alert. Tasks incidental to flying—control of the passenger entertainment system perhaps—might be routed through a device.
We’re moving into a world where at work and at home, we’ll be in the role of a pilot or a television news editor, continually synthesizing information drawn from familiar sources. We’ll want control rooms with high-density displays. They could be more appealing or less appealing, but they will probably not be especially soothing.
Design has moved in the opposite direction, toward sparse aesthetics suited to initial or casual encounters and focal activity. Consumer design geared toward first impressions and focal activity is perfect for music players and phones. Enabling people to do the same task in much the same way on different devices is great. However, when touch is not called for, more detailed selection is possible. Creative window management makes much more possible with large displays. A single application expanded to fill an 80-inch display, if it isn’t an immersive game, wastes space and time.
I observed a 24x7 system management center in which an observation team used large displays in a control panel arrangement. The team custom-built it because this information-rich use was not directly supported by the platform.
You might ask, if there is demand for different designs to support productivity, why hasn’t it been addressed? Clever people are looking for ways to profit by filling unmet needs—presumably not all are mesmerized by successes of design purity. My observation is that our demo-or-die culture impedes progress. A demo is inherently an initial encounter. A dense unfamiliar display looks cluttered and confusing to executives and venture capitalists, who have no sense of how people familiar with the information will see it.
This aggravates another problem: the designers of an application typically imagine it used in isolation. They find ways to use all available screen real estate, one of which is to follow a designer’s recommendation to space out elements. User testing could support the resulting design on both preference and productivity measures if it is tested on new users trying the application in isolation, which is the default testing scenario. People using the application in concert with other apps or data sources are not given ways to squeeze out white space or to tile the display effectively.
Look carefully at your largest display. Good intentions can lead to a startling waste of space. For example, an application often opens in a window that is the same size as when that application was most recently closed. It seems sensible, but it’s not. Users resize windows to be larger when they need more space but rarely resize them smaller when they need less space, so over time the application window grows to consume most of a monitor. When I open a new window to read or send a two-line message, it opens to the size that fits the longest message I’ve looked at in recent weeks, covering other information I am using.
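The ratchet dynamic described above can be sketched in a few lines of code. This is a toy model, not any real application’s logic; the function name and numbers are illustrative. It shows how a restore-last-size policy, combined with users who readily enlarge but rarely shrink, converges on a window sized for the largest content ever shown.

```python
import random

def simulate_restore_last_size(content_sizes, shrink_prob=0.0):
    """Toy model of the 'reopen at last-closed size' policy.

    Each session the window opens at its last-closed size. The user
    enlarges it whenever content needs more room, but shrinks it only
    with probability shrink_prob (near zero, per the observation above).
    """
    window = content_sizes[0]
    for need in content_sizes:
        if need > window:
            window = need      # readily enlarged to fit
        elif random.random() < shrink_prob:
            window = need      # rarely shrunk back down
    return window

# Mostly small messages and one large one: the window still ends up
# sized for the largest message seen, covering other information.
final = simulate_restore_last_size([20, 5, 60, 10, 3, 8])
```

With shrink_prob at zero, the window ends at 60 units even though most content needed a fraction of that; a fit-to-content policy would avoid the ratchet.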
The challenge
The success of the design aesthetic was perfectly timed to the rapidly expanding consumer market and surge of inexpensive digital capability in just the right segment of the exponential curve. It is a broad phenomenon; touch, voice, and a single-application focus are terrific for using a phone, but no one wants to gesticulate for 8 hours at their desk or broadcast their activity to officemates. At times we want to step back to see a broader canvas.
The paucity of attention to productivity support was recently noted by Tjeerd Hoek of Frog Design. The broad challenge is to embrace the distinction between designs that support casual and focal use and those that support high-frequency use that draws on multiple sources. Some designers must unlearn a habit of recommending aesthetic uncluttered designs in a world that gets more cluttered every week. Cluttered, of course, with useful and interesting information and activities that promote happier, healthier, productive lives.
Endnote
1. J. Grudin & A. MacLean, 1984. Adapting a psychophysical method to measure performance and preference tradeoffs in human-computer interaction. Proc. INTERACT '84, 737–741.
Thanks to Gayna Williams for suggesting and sharpening many of these points. Ron Wakkary and Julie Kientz helped refine my terminology use around design, but any remaining confusion is my fault.