
VI.3 May/June 1999
Page: 31

Using the cognitive walkthrough for operating procedures


Author:
David Novick

Suppose you’ve developed a kiosk interface for the general public, perhaps to charge fees at a parking garage. The kiosk has step-by-step instructions for use stenciled on its front. Of course, even if things go terribly wrong in following the instructions, no one will get hurt. But the garage owners could lose customers, and money, if patrons keep getting frustrated; and if people tend to get confused while using the system, an increasingly long line of waiting patrons may show their impatience. The step-by-step instructions constitute an operating procedure, and you hope you’ve got a good one.

So how, absent extensive late-process usability testing, can you improve your system’s operating procedures so that they are safer and more effective?



Jonathan Grudin observed [3] that the user’s interface to a computer does not consist of hardware and software alone. From the user’s point of view, the interface includes an assortment of associated elements in the context of use, including documentation, training, and advice from colleagues. From this perspective, you can view the interface of a computer as including not only the usual screens, buttons, and knobs but also other users present, training received, documentation provided, and operating procedures prescribed. Indeed, the job of guiding users through their interactions with a computer can largely be allocated either to the interface itself or to a set of associated procedures. This is what Guy Boy [1] called the interface-procedure duality. To take a simple example, if the stenciled instructions on the garage kiosk were displayed on a computer screen, they would ordinarily be considered part of the computer interface.

Consider designing procedures and documentation for a new, text-based air-traffic control communications system for aircraft to complement the current voice-based system. Figure 1 is an example of a typical draft procedure that I developed for this kind of "datalink" interface. At design time, how could I assess whether this draft procedure and the many others associated with the datalink interface would be effective?

The correspondence between interfaces and operating procedures suggests a solution: adapt known evaluation techniques (as applied to interfaces in the narrow sense) to the evaluation of procedures (considered as part of the interface in the wider sense). Accordingly, this article shows how to use an adaptation of the cognitive walkthrough [7], as revised and extended specifically for the evaluation of operating procedures. I will discuss how I adapted the regular cognitive walkthrough into a cognitive walkthrough for operating procedures (CW-OP) and show an example of CW-OP in use.

Adapting the Cognitive Walkthrough

The cognitive walkthrough, as it has evolved [4, 7, 8], is a usability inspection method for interfaces that originally focused on evaluating a design for ease of learning. It has since been extended to other phases of interaction and has been adapted to the evaluation of complex interfaces such as a computer-assisted design (CAD) tool [7]. The cognitive walkthrough leads the designer to consider factors such as users’ backgrounds and level of mental effort. It is based on a model of interaction like that of Norman’s stages of user activities [5]. In brief, a task is broken down into steps in the interface, each of which is analyzed for the connections among goals, artifacts, actions, and results. Each step is described as either a "success story" or a "failure story," depending on the outcome of the analysis. Walkthroughs can be conducted either as a group or individually. Designs evaluated in this way should, among other things, reduce the need for training. An extensive body of literature, summarized in [7], has examined the relative effectiveness of the cognitive walkthrough in identifying usability problems.
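
To make the structure of this analysis concrete, here is a minimal sketch in Python of the objects a walkthrough manipulates. It is purely illustrative; the names are mine, not drawn from the walkthrough literature.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Step:
        # One step in the interface, analyzed for the links between
        # the user's goal, the required action, and the system's result.
        goal: str      # what the user is trying to achieve
        action: str    # what the user must do in the interface
        result: str    # what the system does in response

    @dataclass
    class StepStory:
        # The outcome of analyzing one step: a success or failure story.
        step: Step
        success: bool
        rationale: str  # why the evaluator expects success or failure

    @dataclass
    class Walkthrough:
        task: str
        assumed_users: str  # evaluators' assumptions about users' backgrounds
        stories: List[StepStory] = field(default_factory=list)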

From the most practical version of the cognitive walkthrough [8], I developed, with help from my colleagues at EURISCO and Aérospatiale, a new version adapted to operating procedures. This involved iteratively revising and using forms and instructions adapted to procedures and their documentation. The adaptation took into account information and constraints present in the interface’s context of prescribed and actual use. Whereas a physical interface (such as a hardware–software interface) is designed to specifications that account for things such as the nature of the users and the task, an operating procedure includes these as specific elements (they are sketched as a simple record after the list below). A procedure exists to specify, unambiguously,

  • What the task is,
  • When the task should be conducted,
  • How the task should be done,
  • By whom it should be conducted, and
  • What feedback is provided to other agents [2].
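
Read as data, these five elements suggest a record along the following lines. This is an illustrative Python sketch; the field names are mine rather than Degani and Wiener’s [2].

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class OperatingProcedure:
        # The five elements a procedure specifies unambiguously [2].
        task: str                # what the task is
        when_conducted: str      # when the task should be conducted
        steps: List[str]         # how the task should be done, in order
        by_whom: str             # which agent conducts the task
        feedback_to_others: str  # feedback provided to other agents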

Accordingly, the adaptation to procedures included five significant changes, summarized as a data-structure sketch after this list:

  • Dealing with procedural rather than interface steps. Some procedural steps may not involve action in the interface. This addresses the issue of including human–human as well as human–machine interaction.
  • Drawing the evaluator’s attention to the presentation of the procedure in the documentation. This is the way the procedure exists in the context of use.
  • Asking the evaluator to determine explicitly whether training or experience is required for a particular step. These factors are typical justifications for a link between, for example, goals and actions. If training or experience is required, it may indicate that the procedure should be modified or the documentation clarified.
  • Looking at whether the procedure correctly executes the intended function in the overall system. This addresses issues of usefulness and safety.
  • Determining whether errors are probable and, if so, whether they would affect safety. In the domain of flight operations, as in the air-traffic control domain considered in Wharton et al. [8] and Novick and Chater [6], safety is the top priority.
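
A per-step record for the CW-OP might therefore carry the following additional fields. Again, this is a hypothetical Python sketch; the actual CW-OP forms (Figures 2 and 3) organize these points as questions and observation items rather than as named fields.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ProcedureStepAnalysis:
        # Illustrative per-step record reflecting the five changes above.
        description: str        # the procedural step under analysis
        in_interface: bool      # change 1: some steps involve no interface action
        documentation_ref: str  # change 2: how the documentation presents the step
        success: bool           # success story or failure story
        rationale: str
        training_required: Optional[bool] = None           # change 3
        executes_intended_function: Optional[bool] = None  # change 4: usefulness, safety
        error_probable: Optional[bool] = None              # change 5
        error_affects_safety: Optional[bool] = None        # change 5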

How To Use the CW-OP

Like the cognitive walkthrough for the physical interface, the CW-OP is supported by two forms: (1) a cover sheet and (2) a form that presents the success or failure story of each step analyzed. Figure 2 presents the contents of the cover sheet, which in practice is either an A4 or a letter-sized page. Figure 3 presents the content of the form for each step; in use this form is also a standard-sized page. For the evaluation of its operating procedures to succeed, the associated interface needs to be well specified, which may limit the usefulness of the method during early phases of development.
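
Since Figures 2 and 3 are not reproduced here, the following outline of the two forms is a reconstruction from the descriptions in this article; take the item wording and order as my assumptions, not as the actual forms.

    # Hypothetical outline of the two CW-OP forms, reconstructed from
    # the article text; the real A4/letter-sized forms may differ.
    COVER_SHEET = [
        "Task being evaluated and the procedures that constitute it",
        "Procedure steps, or a reference to their specification",
        "Assumed characteristics of the users",
    ]

    STEP_FORM = [
        "Q1-Q4: the step's potential points of failure, including (Q2) "
        "the procedure's affordance as represented in the documentation",
        "Success or failure story, with rationale",
        "O1: is training or experience required for this step?",
        "O2: does the step correctly execute the intended function "
        "in the overall system?",
        "O3: are errors probable, and if so, would they affect safety?",
        "O4: design alternatives",
        "O5: other comments",
    ]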

The CW-OP is normally conducted by a group of evaluators. A study validating the CW-OP [6] found that it was comparable to the regular cognitive walkthrough in terms of the burden on evaluators, and that the CW-OP process clearly identified issues involving the procedural as well as the physical interface. In most cases, evaluators concurred in their assessments. Nevertheless, as with the cognitive walkthrough for the physical interface, it is sound practice to use multiple evaluators. Evaluations can be performed together as a team or individually in parallel. Experience suggests, however, that when training evaluators it is important to have trainees perform some evaluations individually, to make sure that each person has experience in answering all of the questions in the CW-OP forms. For example, one trainee evaluator reported difficulty in making the distinction between success and failure. For both group and individual walkthroughs, I found useful the recommendation in [6] that the assumed characteristics of users be written where everyone can see them and, if necessary, updated in common.
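
When evaluators work individually in parallel, their verdicts for each step can be compared afterward. Here is a trivial sketch of such a comparison, my own illustration rather than a part of the method:

    def concurrence(verdicts):
        # verdicts: one boolean per evaluator (True = success story).
        # Disagreements are worth discussing as a group before revising.
        return "evaluators concur" if len(set(verdicts)) == 1 else "disagreement"

    # Three evaluators who all judged a step a failure story:
    print(concurrence([False, False, False]))  # -> evaluators concur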

The team of evaluators should consist of people who have been trained in human–computer interaction and who have knowledge of, or access to, the specifications or characteristics of the interface being evaluated. The group could benefit from the inclusion of users, particularly if the evaluation is conducted as a team. I want to stress that multiple evaluators bring a variety of experiences with interfaces and procedures, different types of knowledge about human factors, and different levels of expertise about the interface being evaluated. The range of evaluations for operating procedures that we have looked at suggests that multiple evaluators can provide complementary perspectives on the usefulness and usability of operating procedures.

You may want to ask evaluators to mark sections of the documentation that they read or used during the evaluation process. Otherwise, there may be no record of whether the evaluators used any particular part of the documentation as an aid to understanding the procedures. For instance, the "example" section provided in the draft procedure in Figure 1 would usually not be considered a step subject to analysis, but it would be useful to know whether the evaluators had referred to the examples to help them understand the procedure.

Example of Evaluation

The CW-OP was developed as part of a project sponsored by Aérospatiale, the French part of Airbus Industrie. The project involved the design and evaluation of a datalink interface. One of the draft datalink operating procedures I developed, called simply "Respond" (too long and detailed to be usefully presented here), consisted of a general technique for creating and sending a response to a message from air-traffic control. This draft procedure had four main parts: build the message, send the message, confirm receipt, and store the message.

Figure 4 shows a cover page filled out for an evaluation that included the "Respond" operating procedure. The cover page indicates that this was one of three procedures constituting a task. For the particular steps, the cover page refers to a specification; it might be better practice to list the specific steps directly in the form.

Figure 5 shows a form for the first step ("Build the message") in the "Respond" procedure, as produced by an individual evaluator. Walkthrough questions 1–4 address the step’s potential points of failure. In Question 2, the procedure’s affordance is analyzed in terms of its representation in the documentation. The evaluator’s annotation "experience" indicated the areas supporting the conclusion later in the form that experience or training would be required. The walkthrough observation items 1–3 address concerns specific to procedures, particularly in safety-critical systems. These items are useful for less-complex systems, too. So if the procedure for a walk-up-and-use kiosk is judged to require training, then the procedure and presumably the interface should be reworked. In this example, the evaluator did not address the "errors" item because it was already evident by that point in the analysis that this step would have to be changed. Observation items 4 and 5 are intended to elicit design alternatives and any other comments while the step and the overall procedure are still fresh in the evaluator’s mind.
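
To show how such a form might be captured, here is a worked instance for the "Build the message" step, paraphrased from the description of Figure 5 above; the wording is mine, not the actual form’s.

    # The four main parts of the draft "Respond" procedure:
    respond_steps = ["Build the message", "Send the message",
                     "Confirm receipt", "Store the message"]

    build_message_form = {
        "step": respond_steps[0],
        "story": "failure",  # the step was judged a failure story
        "Q2 (affordance in documentation)": "annotated 'experience'",
        "O1 (training/experience required)": True,
        "O3 (errors probable / affect safety)": None,  # left blank: the step
                                                       # already had to change
        "O4 (design alternatives)": "fold into more specific procedures",
    }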

The "Respond" procedure was eliminated in the next version of the documentation. The elements of the procedure were incorporated into other, more specific procedures such as "Respond to a message with multiple clearances."

Conclusion

Aside from the direct assessment of usability, the cognitive walkthrough for operating procedures provides insight into usefulness and safety beyond that associated with the cognitive walkthrough for physical interfaces. In particular, the CW-OP’s requirement of a reason for linking goals to actions leads evaluators to determine whether training or experience is required for correct use of the procedure. In the aviation industry, for example, reductions in training required to operate an aircraft would both increase reliability and lower operating costs.

When using the CW-OP, evaluators should be alert to a couple of areas not explicitly handled in the process. First, evaluators tend not to evaluate the names of the procedures, presumably because they are not steps as such. These names, though, are usually users’ main means of access to procedures, and evaluators should consider treating the name as a step in order to make sure the name’s affordance is examined. Second, overall instructions tend not to be evaluated. For example, the draft manual for the datalink interface contained an instruction on when members of the crew should make announcements to each other during procedures. The CW-OP evaluation of the draft datalink operating procedures produced only one comment on this instruction in the session. How to obtain analyses of such overall instructions remains an open question. Similarly, it is not clear how to use the CW-OP technique for evaluation of nonprocedural parts of operating documentation. Nevertheless, the CW-OP is being used to evaluate draft operating procedures and has resulted in specific, useful changes in the operating procedures developed as part of the datalink interface program.

Acknowledgments

This research was supported by a research contract from Aérospatiale Aéronautique. Thanks to Florence Buratto, Meriem Chater, Laurent Moussault, Florence Reuzeau, and Stéphane Sikorski for their participation in and contributions to the research.

References

1. Boy, G. Cognitive function analysis for human-centered automation of safety-critical systems. Proceedings of CHI 98 (Los Angeles, CA, April 1998), 265–272.

2. Degani, A., and Wiener, E. Procedures in complex systems: The airline cockpit. IEEE Transactions on Systems, Man, and Cybernetics 27, 3 (1997), 302–312.

3. Grudin, J. Interface. Proceedings of CSCW 90 (Los Angeles, CA, October 1990), 269–278.

4. Lewis, C., Polson, P., Wharton, C., and Rieman, J. Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. Proceedings of CHI 91 (Seattle, WA, April 1991), 235–242.

5. Norman, D. Cognitive engineering. In Norman, D., and Draper, S. (eds.), User Centered System Design. Lawrence Erlbaum Associates, Inc., Hillsdale, NJ, 1986.

6. Novick, D., and Chater, M. Evaluating the design of human-machine cooperation: The cognitive walkthrough for operating procedures. In Proceedings of the Conference on Cognitive Science Approaches to Process Control (CSAPC 99), Lille, France, forthcoming.

7. Wharton, C., Bradford, J., Jeffries, R., and Franzke, M. Applying cognitive walkthroughs to more complex user interfaces: Experiences, issues and recommendations. Proceedings of CHI 92 (Monterey, CA, May 1992), 381–388.

8. Wharton, C., Rieman, J., Lewis, C., and Polson, P. The cognitive walkthrough method: A practitioner’s guide. In Nielsen, J., and Mack, R. (eds.), Usability Inspection Methods. John Wiley & Sons, Inc., New York, 1994.

Author

David G. Novick, Director of Research
European Institute of Cognitive Sciences and Engineering (EURISCO)
4 avenue Edouard Belin
31400 Toulouse France
novick@onecert.fr
http://www-eurisco.onecert.fr/~novick

Methods & Tools Column Editors

Michael Muller
Lotus Development Corp.
55 Cambridge Parkway
Cambridge, MA 02142
+1-617-693-4235
fax: +1-617-693-1407
mullerm@acm.org

Finn Kensing
Computer Science
Roskilde University
P.O. Box 260
DK-4000 Roskilde
Denmark
+45-4675-7781-2548
fax: +45-4674-3072
kensing@dat.ruc.dk

Figures

Figure 1. Example of a procedure.

Figure 2. Contents of the cover sheet.

Figure 3. Contents of the procedure-step form.

Figure 4. Example of cover sheet.

Figure 5. Example of story for a procedure step.

©1999 ACM  1072-5220/99/0500  $5.00
