Jonathan Arnowitz, Elizabeth Dykstra-Erickson
Simply following a user-centered design process does not ensure good design. Delivering a quality experience means understanding and articulating good design practice. That means not slavishly applying a process, but critically engaging with it, from problem articulation through designing, innovating, prototyping, and usability testing.
Designing a good solution depends very much on defining the problem we wish to solve early in the design process. Often, designers are given a solution to explore, rather than a problem to solve. Knowing the difference between the two is critical. Beginning a design effort from the proposed solution can produce a brilliant design, but it can also produce shallow results. Beginning a design effort from problem exploration can produce brilliant results that accommodate the unexpected, provide scalability and generalizability, and take into account unconventional factors that might otherwise go unnoticed. When we begin a design effort, we ask ourselves:
- Is the problem space larger than the solution space? That is, given the constraints of which we are aware, have we identified more than one possible solution, or are the possibilities extremely limited?
- Have we sufficiently explored the design options to stretch the boundaries of the solution space? That is, did we try at least three ways to think about the problem?
- Does the problem, and/or the solution, overlap with other problems and solutions?
Product marketing teams develop functional requirements (what a product should do). Designers, by convention, develop specifications: the blueprint of how the product behaves, and the visual treatment for it. But sometimes we work from specs. In that case, the solution is defined, but the design isn’t necessarily complete or integrated. When we work from specs rather than requirements, we should audit these solutions to ensure an appropriate match between problem and solution space. Don’t assume we can “know” the right answer when we see it; user studies, competitive analysis, brainstorming and collaborative ideation, previous design experience (especially the cataclysmic failures), and intuition all play a part in developing a design approach. Beyond that, the devil is in the details: expect that some things will change. And expect valid criticism from the most unlikely sources: we can all learn from everyone.
We all too often see designers work on a design and immediately begin to socialize their solution with development and other stakeholders to get their buy-in, without first exploring multiple solutions and different conceptual designs. The first or most obvious solution should not be mistaken for the only solution. Well in advance of showing our designs to stakeholders, we should first reach out to fellow designers and examine other solutions to analyze the operational variables that can be incorporated into a robust design. Premature buy-in from stakeholders makes it extremely difficult to adjust a design once it is well on its way through the development cycle. Stakeholders are decision makers; weighing the advantages and disadvantages of a design proposal isn’t a simple yes-or-no proposition. We should offer stakeholders a few rough ideas at the beginning and be prepared to voice the limitations or opportunities the concepts present. Offering a single solution as the only solution may be our best bet (budget, schedule, and technology permitting), but make that determination consciously. Stakeholders often see a solution and opt in, not realizing that there are possible alternatives. Worse, when better alternatives are presented, the socialized one may be preferred because people are already emotionally invested in it. We can’t determine the optimal design when only one option is presented. And with the wrong solution, previous user studies and analysis could be for naught.
We’ve said a number of times that designers need to understand the technology employed in their designs. We need to work closely with engineering to understand the limitations. But if our task is to move design forward and create something truly new and interesting, the designer needs to push engineering to accomplish things they didn’t think possible. It’s easy to tell when we can’t push (there are usually project managers around who will confirm our suspicions on this point), and harder to tell when we should. Nevertheless, if innovation is to be a distinguishing factor in our design work, then working “within the box” won’t suffice. A novel breakthrough requires luck, circumstance, a great idea that wasn’t previously possible, creativity, and a good deal of perseverance to see it through. We can attest that a single design can change an industry: witness the media frenzy around the iPhone. Some very old ideas, some new: a spectacular ad campaign, and a clear wake-up call to the cell phone interface industry to get busy and compete.
In his classic book on problem solving, The Reflective Practitioner: How Professionals Think in Action, Donald Schön warns practitioners not to “cut the practice situation to fit professional knowledge” by a) becoming selectively inattentive to data, b) attributing failure to personalities and politics, or c) trying to force the situation into a mold that lends itself to the use of available techniques. We do such things to maintain our confidence in standard models and techniques, copying solutions simply because they exist, following conventions because they are conventions. A reflective practitioner reserves the capacity to innovate in both method and result.
Prototyping is not a monolithic activity; it is broad and complex, and many things to many people. Effective Prototyping for Software Makers and Thoughts on Interaction Design both acknowledge this. When we undertake prototyping, we should carefully analyze what needs prototyping, then determine the goal our prototype is trying to achieve, and choose the prototyping method and tool on the basis of that analysis. Otherwise, prototyping all too often degenerates into trying to wow the prototype audience with an overly high-fidelity prototype. Our prototype might win today but lose tomorrow, as the team buys into a prototype that formalizes too many design decisions too early in the game, decisions that can’t later be changed. And we all know that the later in the process changes are introduced, the more expensive and risky they become.
Usability testing is not the single solution to all our usability problems. Marc Hassenzahl, professor for economic psychology and human-computer interaction at the University of Koblenz-Landau in Germany, pointed out at the recent CHI-Netherlands Conference what he observed at one company: Three usability engineers, with the same educational degrees, who work for the same company and perform the same usability tests, each have radically different approaches to their tests. In describing their moderation style, one person described himself as cool and noninvasive, like a scientist; another described his approach as lightly prompting, saying it isn’t science but exploration; yet a third described his moderation style as invasive, a form of collaborative discovery. Hassenzahl pointed out that these three styles yield very different results. In some environments that is very useful, but in most, it’s a big investment with potentially incongruous results that don’t move the design forward. Usability testing needs to examine purpose, goal, potential, opportunity, limits, and other factors, and couple those with the testing method and moderation style.
There really is no absolute predictive process, leaving us all the freedom to become the masters of our processes, rather than being at the mercy of them.
©2007 ACM 1072-5220/07/0900 $5.00