In 1996 Cabletron Systems, Inc., an industry leader in providing high-performance computer network hardware and software solutions, formed two cross-functional teams to conduct contextual inquiry studies for two important projects. The primary purpose was to evaluate the Contextual Design methodology. The studies were successful, and both teams agreed that the methodology exceeded their expectations. The participants found the experience both exhausting and eye-opening. For several team members, the interviews provided their first detailed look at real users in their work environments.
In the fall of 1997, a team of software engineers at Cabletron Systems, Inc., was asked to define an interface that would consolidate current methods of managing network devices. Such devices expose information about their configuration, activity, capacity, load, and performance. After several weeks of analyzing the device management screens from three of Cabletron’s network management products, the team decided that they had an insufficiently detailed understanding of the tasks involved in device management to make the best decisions regarding the user interface. After consulting with the Usability Engineering (UE) Group at Cabletron, the team decided to perform a contextual inquiry study of device management. Their goal was to understand the work practices of people who manage various network devices and to apply this understanding to the design of a single common user interface. Specifically, we wanted to know under what conditions a user would want to interact with a device and its attributes, and how she would go about doing that. This project happened to fit well with a proposal from the UE group to develop work practice models representing the majority of our customers’ roles and tasks. Having successfully completed the two aforementioned contextual inquiry projects the previous year, and having recently trained two people in a Contextual Design Leadership Training session offered by InContext Enterprises, the UE team, along with the Device Management team, decided to accept this project. As a bonus, the UE group hired a senior-level specialist with substantial experience in leading teams in contextual inquiry.
A multidisciplinary team was assembled to perform the contextual inquiry. The team was led by the manager of the usability group and initially consisted of developers of device management products, a quality assurance (QA) engineer, and three usability specialists. About a month after the project began, a technical writer and a graphic designer joined the team. During the 8 months of the project, three of the developers left the company and were replaced on the team by other developers. The team also added a usability intern. Additionally, the UE group wanted to promote the use of contextual techniques within the company, and therefore deliberately allowed for a rather large team, so that more people could experience the process and the customer contact. Although the changes to the team could have seriously set us back, those who joined the team were able to get up to speed quickly, minimizing disruption.
The project began with a one-day training session in the basics of how to conduct a contextual interview. The UE team leader presented the training, which consisted of both a lecture and role playing.
Fortunately, the team had been given dedicated space in a large training room at Cabletron’s Customer Training facility in Portsmouth, New Hampshire, only 10 miles from our engineering location in Durham. This resource room made it possible to keep all of our data visible on the walls and tables and allowed us to focus on our work without interruption.
One of the usability specialists was responsible for selecting appropriate users from our local customer base and scheduling the interviews. All interviews were conducted in pairs, with one person serving as interviewer and the other as note taker.
Most of the team members felt that they knew little about the boundaries of the domain of device management, as well as the details within that domain. Therefore, we set a rather broad focus for the study: What does "device management" mean to a user? The focus started out as exploratory and became progressively more specific. Despite our screening of participants by phone, we occasionally found ourselves interviewing people whose work domain was somewhat outside our focus. In these cases we continued with the interviews, assuming that any data are better than no data, and that if analysis showed that the data didn’t fit our scope, we could always set them aside and reexamine them later.
After each interview the team would gather to interpret the data. The entire team was seldom together for the interpretations; typically about half the team, but always one or both of the interviewers, was present. During these interpretations we looked for data for the affinity diagram ("factoids," insights, design ideas, and breakdowns) and sequence models. We extracted work flow models later. We did not look at cultural models and dealt minimally with physical models. If artifacts were available, we looked at them during the interpretation sessions.
Adding Other Sources of Data
In January 1998 the team attended Cabletron’s annual Spectrum Users Conference in Phoenix. The conference is an opportunity for users of Cabletron’s flagship network management system, Spectrum, to hear about new features, products, and partner applications. At the conference we were able to conduct retrospective interviews with 14 users, targeted at troubleshooting specific, common networking devices. We encouraged participants to describe a particular troubleshooting task and provided them with screen shots as a reference. However, these interviews were based on their recollections rather than on our observations of their actual work. Therefore, we built a separate affinity diagram for data from these interviews. We found it interesting that similar issues emerged from both the contextual and retrospective interviews.
In addition to the retrospective interviews, some team members who participated in the conference set up and conducted contextual interviews at customer sites in and around Phoenix. Data from these interviews were incorporated into the first affinity.
Telling the Story
By February, we realized that our team had been sequestered in our training facility in Portsmouth for several days during the previous two months. Our coworkers in Durham were beginning to seriously wonder what we were up to (as were training personnel, who had never before seen so many "sticky" notes in one place). Furthermore, we believed that the process we were following to obtain and understand the data, as well as the data themselves, would greatly interest Cabletron’s software development community. We wanted our users to "come alive" for people throughout the company.
The team decided to try an open house approach, in which groups would be invited to visit our resource room. The 2-hour open house followed a simple format. First, participants were briefed on the basics of the contextual inquiry/design process; the purpose of, and main issues revealed in, each of the types of models; and tips on how to "read" the models. We set up stations around the room displaying various types of data. Although people could start anywhere they wanted, we set up the room to have an optimal path. A team member was positioned at each station. At this point in our process, none of our models had been consolidated; each was based on an individual interview.
To provide a context for our guests to better understand users’ issues and work practices in the affinity diagram, we displayed profiles of our participants with identification codes linking them to their actual words and concerns. A guest could read a user’s comment and then the user’s profile to learn about her job title, networking experience, and her company’s background. Next we looked at sequence models that show users’ tasks. Another station in our open house room was a wall of kudos, quotations praising Cabletron products and/or services. We also displayed data that we collected at the users’ group conference, findings from a screen survey, and quotations from users, and we played a video from one retrospective interview.
After hearing our introduction, participants spent the next hour immersing themselves in the data. We gave everyone a handful of self-sticking notes and a pen and encouraged them to write their insights, questions, and design ideas and post them near the data that triggered their thoughts.
An hour later we gathered everyone together for a 30-minute discussion. We asked questions such as "What did you see that surprised you?" and "What did you see that either reaffirmed or contradicted what you believed when you entered this room?" For the most part, the discussions were lively, although some groups seemed more lively than others.
We hoped that participants would be intrigued by the process, but more important, that they would gain a deeper understanding of our users and their work practices. Furthermore, we hoped that this introduction might inspire the participants to think in new ways about how to support our users and their tasks. Initially we planned to hold a few open house sessions focused on the managers of our team members, as well as the Device Management engineering team. However, people started to ask to see our work, so we ended up conducting more than 15 open houses (approximately 150 people) throughout February. The groups represented included engineering, QA, training, curriculum development, technical writing, and product support.
We recorded all the questions and comments provided by the participants, which revealed that there were some surprises, disappointments, and quite a few questions. Specifically, participants were surprised to see that users rely on myriad tools to do their work, including multiple databases. Some were surprised at the number of our users who are relatively new to our network management products, as well as to the field of networking. One fact that surprised our team as well as the Open House participants is that device management is somewhat of an artificial boundary; users tend to lump all their activities into the category "network management". In fact, the term "device management" seemed unimportant and not particularly meaningful to users whose work responsibilities included both device and network management tasks. Here is how one of our development managers responded to this finding:
One fact still impresses me. We started out our task to understand device management, but now I think that we have a more fundamental view of the customer that is much different than the one that we had fabricated. Just because our product has a component for device management doesn’t mean that the user breaks up their tasks along the same boundaries.
Some participants expressed disappointment that certain parts of our products still seemed to have usability issues, despite our ongoing work to improve them. Others were disappointed that users don’t always use our tools the way we would expect them to. It was clear from the data that once users have difficulty with a product or process, it’s an uphill battle to earn back their confidence. Some participants also noticed that users tend to develop habits that might be difficult to break. The team’s purpose in having the discussion be part of the open house was to let the participants do the talking. However, we tried to explain that although the data might contradict what the participants had believed, the participants were now in a much stronger position to address the issues.
Finally, participants had many questions about the data, mostly related to the particular part of the product they were working on. Questions included "What do users want in terms of security?" "What 20 percent of the functionality might satisfy the needs of 80 percent of the users?" "Who uses the documentation, and how do they use it?" and "How will we use all this information to actually drive design?" We addressed the last question by briefly describing the Contextual Design process and impressing on the participants that everyone who contributes to a product bears some responsibility for making the product support our users and their work practices.
Gathering More Data
Before we could consolidate the models, we had another opportunity to collect more data, this time through another project. In March one of the usability specialists needed to do a work practice evaluation at one of Cabletron’s large customers. We used this opportunity to gather more data: team members conducted 15 interviews at the customer site. Because of time constraints, we transcribed and interpreted the interviews most related to our focus, knowing that we could go back to the untranscribed tapes at any time if necessary. These data were analyzed and incorporated into our models. We currently have models for approximately 15 users, representing about eight companies.
Consolidating the Models
After gathering and interpreting all the data, we spent the next few weeks creating the consolidated sequence models and the consolidated work flow model. We divided into two groups (with some overlap) to perform the consolidations, with one group working on sequences and the other working on work flows.
Visioning and Storyboarding
We asked a few developers if they would join us for a week of visioning. It was an energizing, exciting, and insight-filled week. During the week we immersed ourselves in the data, extracted the issues, brainstormed solutions, prepared a vision of a selected idea, evaluated and refined the vision, and created storyboards. The developers were especially impressed with the synergy of the multidisciplinary team. There were some tense moments when we had to stop and redefine the words we were using, or just take a break to cool off, but the members of the team agreed that those tense times often led to the most satisfying breakthroughs. We ended up with a storyboard that addressed more issues than we had initially intended. This was likely because we had truly immersed ourselves in the user data, and so carried the user issues around with us in our minds, applying them to our ideas as they arose.
After the week of visioning and storyboarding we refined the storyboard and a team member redrew it. We scheduled meetings of groups of 3 to 5 people to show them our process and result. Our purpose was threefold: (1) to demonstrate the value of the process of contextual inquiry and Contextual Design; (2) to show a vision, in the form of a storyboard, that we believe will meet the needs of our users; and (3) to facilitate better communication between and within groups. We deliberately chose for the small groups people from various disciplines and project groups. The discussions in most of these groups were lively and helped the team to better understand the perspectives of different groups on how this vision either fit or did not fit into their current understanding. We are currently working on incorporating the concepts from our Contextual Design activity into future releases of our network management products.
In carrying out this project, the team learned some powerful lessons. As is usually the case, we did some things that we would not do again and discovered some invaluable optimizations for future contextual inquiry and Contextual Design projects, as follows.
- Transcribing interviews, although tedious and time-consuming, is worthwhile. Next time we will consider hiring a transcription service to perform this task.
- Having a clear focus is helpful in collecting data. Our focus started out broad, and as a result we collected a lot of rather disparate data.
- Having developers participate in the interviews and interpretations is essential to understanding the user’s actions and intents, since they are more familiar with the product and its functionality.
- Planning is vital in keeping the team’s momentum, enthusiasm, interest, and commitment levels high.
- Reporting interim results to stakeholders outside the team takes a lot of time but is a necessary and very valuable part of the process.
- Relocating the resource room closer to developers, to encourage more consistent participation in the process, would be beneficial, especially when beginning the design phase.
Perhaps our most important lesson was the necessity of keeping the engineering community aware of the information as it becomes available. Contextual inquiry and Contextual Design can take some time, and data collection and analyses should not occur in a vacuum. We found that many of the developers wanted the data to appear in a more digested form. The question commonly heard was "This is all good, but what does it mean to me?" When the team is immersed in the data daily for several weeks, it is easy to forget that it takes time and context to comprehend all the data. Another comment heard frequently at the open house was that the amount of information in the room was "overwhelming."
In retrospect, we would strongly recommend the open house approach but would suggest that the data be presented more concisely; for example, lists of major issues and breakdowns, or lists of task sequences followed by the details of the sequences. It would even be helpful to take a common (consolidated) sequence and display the current product (and other application) screens used in completing that sequence, so that developers can easily map the data to their product.
At the time of this writing, we have divided the team into two groups, each of which is working on a project derived from our Contextual Design work. One group is delving more deeply into the original focus of device management, and the other group is working with project teams involved in applications that support troubleshooting. Our ultimate goal is to provide the software engineering organization with a framework within which they can design products and features that address the work practices and issues of our users.
The author would like to thank all the team members for their long hours and hard work, especially considering that most, if not all, of them had to do this project in addition to their normal workload. Team members include the author, Anthony Bangrazi, Joshua Corman, Rebecca Dickie, Daren Dulac, Joan Freed, Joe Greenwald, Anna Motor, Dean Ouellette, Sharon Reynolds, Gayle Sanders, Karen Shor, Valerie Twombly, and Hannah Vostrovsky. I also thank all the users who allowed us to peer over their shoulders and ask countless questions; your sacrifice will result in better products. Finally, I would like to thank our management at Cabletron Systems, Inc., especially Bill Tracy, Tom Dennis, and Chuck Black, for allowing us to conduct this ambitious project and for giving us everything we needed in order to do it right.
Senior Usability Specialist
Cabletron Systems, Inc.
©1999 ACM 1072-5220/99/0100 $5.00