An Impact Analysis Method for Safety-Critical User Interface Design
Julia Galliers, City University, London, UK.
Alistair Sutcliffe, UMIST, Manchester, UK.
Shailey Minocha, Open University, Milton Keynes, UK.
This paper describes a method of modeling safety-critical user interfaces based on Bayesian belief networks (BBNs). The background comes from safety-critical analysis methods such as Failure Modes and Effects Analysis (FMEA) and the Technique for Human Error Rate Prediction (THERP), which acknowledge that environmental, human, and social contextual factors influence the propensity for system failure but offer no systematic way of reasoning about such factors. BBNs provide formal techniques for causal analysis of the sources of potential error in a system and its environment. The BBN model is composed of cause-and-effect nodes to which Bayes’ theorem is applied to calculate probabilities of failure given a set of input measures for the influencing factors. The model described in the paper is a preliminary generic model that accounts for operators’ aptitude, knowledge, and training; their motivation; and possible fatigue resulting from environmental conditions such as comfort, time pressure, and interruptions. J. T. Reason’s theory of human error informed the design of the BBN, which predicts the probabilities of mistakes and slips given an input scenario of the factors.
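The node-level computation described above can be sketched as a tiny hand-built belief network. Note that the node names, structure, and all probability values below are illustrative assumptions for a two-link chain (time pressure → fatigue → slip), not values from the authors’ model:

```python
# Minimal sketch of a discrete Bayesian belief network.
# All nodes are binary; conditional probability tables (CPTs) map a
# parent's state to P(child = True). Values are made up for illustration.

P_time_pressure = 0.3          # prior: P(time pressure is high)
P_fatigue_given = {            # CPT: P(fatigue | time pressure)
    True: 0.6,
    False: 0.2,
}
P_slip_given = {               # CPT: P(slip | fatigue)
    True: 0.25,
    False: 0.05,
}

def p_fatigue() -> float:
    """Marginalize fatigue over the time-pressure prior."""
    return (P_fatigue_given[True] * P_time_pressure
            + P_fatigue_given[False] * (1 - P_time_pressure))

def p_slip() -> float:
    """Marginalize slip over fatigue."""
    pf = p_fatigue()
    return P_slip_given[True] * pf + P_slip_given[False] * (1 - pf)

def p_fatigue_given_slip() -> float:
    """Bayes' theorem: update belief in fatigue after observing a slip."""
    return P_slip_given[True] * p_fatigue() / p_slip()

if __name__ == "__main__":
    print(f"P(slip)           = {p_slip():.3f}")
    print(f"P(fatigue | slip) = {p_fatigue_given_slip():.3f}")
```

A full model chains many such factor nodes into the mistake and slip outcome nodes; the arithmetic at each node is the same marginalization and Bayesian update shown here.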
Also modelled in the BBN is the user’s task in terms of complexity, as well as the user interface design, which is expressed as a high, medium, or low level of quality. The model is run by loading it with an operator-environment scenario and then walking through each task interaction, described in terms of complexity and quality of user interface. A set of sensitivity analyses can be carried out to determine which influencing factors may cause higher probabilities of mistakes or slips for actions with different complexity profiles. The next step is to link this to the design of safeguards in the user interface. We adapted D. A. Norman’s seven-stage model of action, a walkthrough method that indicates problems at each stage (specify action, execute action, etc.), with general guidelines from Safeware by N. Leveson (1995) to prevent slip- or mistake-type errors. The walkthrough model also includes a side loop for system failure diagnosis and repair, with guidelines for corrective action and containment of problems. The method and use of the BBN model are illustrated by a case study analysis of user interface safety requirements for a scientific instrument control system.
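The sensitivity-analysis step can be illustrated with a one-at-a-time perturbation loop: raise each influencing factor’s prior by a small amount and rank factors by how much the error probability shifts. The factor names, priors, and conditional probabilities below are hypothetical placeholders, not the paper’s calibrated values:

```python
# Sketch of one-at-a-time sensitivity analysis over single binary
# influencing factors. All numbers are illustrative assumptions.

def p_error(p_factor: float, p_err_hi: float, p_err_lo: float) -> float:
    """Marginal P(error) for one binary influencing factor."""
    return p_err_hi * p_factor + p_err_lo * (1 - p_factor)

# (name, prior, P(error | factor present), P(error | factor absent))
FACTORS = [
    ("time pressure", 0.3, 0.25, 0.05),
    ("interruptions", 0.4, 0.15, 0.08),
    ("low motivation", 0.2, 0.30, 0.08),
]

def sensitivity(prior: float, hi: float, lo: float,
                delta: float = 0.1) -> float:
    """Change in P(error) when the factor's prior rises by `delta`."""
    return p_error(prior + delta, hi, lo) - p_error(prior, hi, lo)

def rank_factors(factors):
    """Sort factors so the most influential comes first."""
    return sorted(factors,
                  key=lambda f: sensitivity(f[1], f[2], f[3]),
                  reverse=True)

if __name__ == "__main__":
    for name, prior, hi, lo in rank_factors(FACTORS):
        print(f"{name:15s} dP(error) = {sensitivity(prior, hi, lo):.3f}")
```

In a real BBN the perturbation propagates through intermediate nodes rather than a single table, but the ranking principle is the same: factors whose perturbation moves the mistake or slip probability most are the ones that most warrant interface safeguards.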
This approach does, however, come with a health warning. The BBN model is a preliminary hypothesis created by the authors’ judgment, although it does at least embed cognitive theory. The model has not been validated, and there are serious problems in doing so given the combinatorial explosion of influencing factors. Hence, we advocate the use of BBNs as a sensitivity analysis instrument to explore which influencing factors may be more critical in a certain task domain. That does not obviate the need for further calibration and improvement of our model. Future applications need to be specialized for a particular domain in which historical evidence exists to enable model calibration. What we have produced is a novel process and proof of concept in safety-critical requirements engineering for interactive systems on which others can build to improve the accuracy of the model’s predictions and utility of the design advice.
©2000 ACM 1072-5220/00/0500 $5.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2000 ACM, Inc.