TA2000 is the most widely used shareholder record-keeping system in the mutual fund industry. Performing the accounting for more than 48 million shareholder accounts, it delivers extensive functionality and support for every facet of the mutual fund industry's full-service and remote operations. TA2000 is a mainframe system with many alternative access points (telephone, Web, workstations). The opportunity to rebuild the user interface of the TA2000 workstation evolved from the executive decision to switch from OS/2 (the current platform) to Microsoft Windows NT. Thus the door was opened to address workstation interaction issues.
In April 1996, one of our corporate officers requested a workstation processing vision brainstorming session. I was selected as the event facilitator. The session, which lasted one afternoon, was designed to list and order client issues involving the integration of our workstation products. The chief information officer (CIO) and his direct reports as well as several senior client staff attended.
We began our discussion by showing video clips of actual call-center processing interactions. Figure 1 is a schematic of typical call-center processing interactions as they are handled on our current OS/2 platform. A shareholder may call to request information or action on an account. The associate interprets the request, accesses or changes the information through one or more TA2000 applications, and responds. The operator then records the processing action using an image-enabled work management system.
As shown in Figure 1, the workstation applications used by operators involved multiple user interfaces. The interfaces were either mainframe (3270 based), graphical (GUI), or both. The Automated Work Distributor (AWD®) is a separate DST product that is used in conjunction with TA2000. Generally, the interfaces differed in their presentation, navigation, layout, and labeling as a result of their respective product development histories. Perhaps a more important issue was the way in which operator tasks extended across applications. Initiating an action in one application often required an operator to remember to finish the task in another application.
The video showed a processor sitting in front of her workstation taking a phone request from a shareholder. A clock superimposed on the screen showed the elapsed time to complete the shareholder request, a device that illustrated for the audience the complexity of the desktop. Watching the processor navigate between applications, and among windows within an application, in an attempt to minimize time spent on the call concretely illustrated the meaning of integration and encouraged discussion of the needs of businesses and operators. Some audience members saw an opportunity to make learning easier; even experienced operators found it difficult to know how to use the current system. Others saw a need to improve the efficiency of data processing. Still others recommended a system that would result in fewer errors.
The Desktop Vision session produced a number of guiding principles that influenced the design and development methodology of the new interface:
- All DST products will be integrated and have a unified, consistent interface.
- They will have the same "look and feel" supported by common interaction designs.
- Operators will be able to move seamlessly from one application to another without realizing they are working with different products.
- Reduced keyboard or mouse actions will be a goal for every project. Windows will be designed to ease the use of keyboard and mouse.
- All applications will support novice and experienced operators.
- Product or project designs will be driven by work flow analysis. Our project methodology will emphasize iterative usability evaluation.
- GUI corporate standards will shape the development of all operator interfaces.
There were many more functionally specific guidelines, and all served to encourage what was at that time a new paradigm for DST's operator interface development process: specifically, increasingly direct contact between the interface designers and the interface users.
Keil and Carmel described a customer-developer link inventory. They used the inventory to support the belief that greater customer participation can lead to more successful software projects. As a successful software vendor for many years, DST has made good use of a large variety of the available links (e.g., facilitated teams, business analysts who mediate between users and programmers, support lines, surveys, user groups, and trade shows). When human factors was introduced to DST, these links were supplemented with more direct links, such as contextual observational studies, operator-interface prototyping, and formal usability lab evaluations.
For one showcase project, the confluence of (1) our decision to adopt the NT platform for our GUI; (2) the desktop visioning for an integrated, unified, and consistent interface; and (3) the introduction of contextual design methods occurred at an opportune moment. The Common Business Object (CBO) team had been laboring to create a set of objects that supported the separation of the TA2000 operator interface from the core business logic and the data. Separation of the two provides for the reuse of both operator interface software and CBOs in a variety of workstation products.
One difficulty encountered by the CBO team was a result of their focus on objects. Since TA2000 exists as a mainframe product with operator interfaces on both 3270 and OS/2 platforms, the team had ample code available from which to abstract business rules and develop objects. However, the team had been working without any NT platform vision for the operator interface layer. Further, the team was working without a direct link revealing how operators used the current system. One might imagine a group of people wandering the countryside analyzing the purpose of gas cans without any concept of a car. It became clear that development of CBOs must be conducted in the context of product use. This required creating a vision of a work flow and its associated operator interface.
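The separation the CBO team was after can be sketched as a simple layering: business rules live in an object that knows nothing about any particular interface, so the same object can sit behind a 3270 screen, an OS/2 GUI, or the new NT desktop. The classes, fields, and rule below are hypothetical, invented purely for illustration; they are not TA2000's actual objects.

```python
# Hypothetical sketch of a Common Business Object (CBO): account state and
# business rules are kept entirely separate from presentation code.
from dataclasses import dataclass


@dataclass
class ShareholderAccount:
    """Business object: holds state and rules, with no UI knowledge."""
    account_id: str
    owner_name: str
    address: str

    def change_address(self, new_address: str) -> None:
        # The business rule lives here, not in any window's event handler,
        # so every interface that reuses this object enforces it.
        if not new_address.strip():
            raise ValueError("address may not be blank")
        self.address = new_address


class AccountDetailsWindow:
    """Interface layer: renders the object and delegates all changes to it."""
    def __init__(self, account: ShareholderAccount):
        self.account = account

    def render(self) -> str:
        return f"{self.account.account_id}: {self.account.owner_name}, {self.account.address}"


acct = ShareholderAccount("001-234", "J. Doe", "1055 Broadway")
acct.change_address("64 Main St")
print(AccountDetailsWindow(acct).render())
```

Because `ShareholderAccount` never imports anything from the window layer, a new operator interface can be built on it without touching the business logic, which is the reuse the CBO effort was designed to provide.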
May 1996 marked the beginning of the TA2000 workstation design and development effort using contextual analysis techniques. According to the desktop vision approved by executive management, the main criteria for success of the GUI focused on improving the users' work activities. Of course, this required understanding operators' current activities and generating creative ideas for improvement. Initial data gathering techniques consisted of
- Videotaping (showing others what we saw),
- Survey taking (asking users what data and activities they thought were most important within a work flow), and
- Contextual observation (sitting next to a call center associate and recording what we saw and heard).
The technical team also ran mainframe queries to identify which applications and functions on TA2000 were most often accessed. We used these data to identify common workflow sequences (what is done?), workflow influences (why that way?), and roadblocks to efficiency.
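The kind of sequence analysis those queries fed can be sketched in a few lines: given an ordered log of which application each operator action touched, counting adjacent pairs surfaces the most common transitions, which is one way to spot candidate workflow sequences. The log format and application names here are invented for illustration.

```python
# Hypothetical ordered access log: which TA2000 application each successive
# operator action touched. Counting adjacent pairs reveals common transitions.
from collections import Counter

access_log = ["inquiry", "maintenance", "AWD", "inquiry", "financial",
              "AWD", "inquiry", "maintenance", "AWD"]

# Each (source, destination) pair is one observed transition between apps.
transitions = Counter(zip(access_log, access_log[1:]))

for (src, dst), count in transitions.most_common(3):
    print(f"{src} -> {dst}: {count}")
```

Frequent transitions like `maintenance -> AWD` suggest tasks that span applications, exactly the integration points the workflow analysis was looking for.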
The observation sessions represented a new approach for our business analysts (BAs). Formerly expert operators, they had been promoted from the processing work force into the development work force. Typical job requirements of BAs are to define and interpret user needs and requirements for technical teams, answer client inquiries about system use, and verify new system functionality. BAs rely on their system experience, system documentation, and a network of knowledgeable experts to complete their assignments. Unfortunately, there is often a significant difference between what people remember about their work experience and how that work was actually accomplished. Many details of work strategy are forgotten or never explicitly realized. In order to integrate work processes, the team needed a way to make the work activities more visible.
In-context work observation became the tool of choice to bridge the knowledge gap and focus on operators' needs. For BAs, the transition from being active interpreters of users' needs to observers of users' actions and strategies was not always easy. Observational sessions require a more reflective, interpretive perspective. We followed an apprenticeship model, in which the operator is the expert and the observers are being tutored. This approach requires a change of attitude and a focus on the intent, strategy, concepts, and structure of the work activity. Team members found it easy to observe and document work practice and structure; they had to learn to ask about and observe work intent and strategy.
To develop observer skill sets within the team, I introduced vendor courses, gave many small group learning sessions, brought new team members along on observation trips, and most important, practiced with the team to improve our observational and analytical skills. Practice involved multiple observation sessions at multiple sites. After each site visit, team members gathered to tell each other the "work stories" that were observed. Timing is essential, because memory of observational details (especially work context) is quickly lost. Early sessions inevitably led to more questions. By iterating observations with analysis, team members improved their observational skills. The more experienced they were in anticipating questions encountered during analysis, the more refined the observations became. As the project grew, so did the number of BAs. Experienced BAs would move on from analysis to design (Figure 2), but there was always enough experience within the core observation team to mentor the new members.
One team member described the analysis process as "tribal." Essentially, the design group sat around a table and told workflow stories. At first the stories were observed work activities. Over time, the stories evolved into generalized work descriptions. During the analytical sessions, workflow scenarios were redesigned if the design team saw a way to improve efficiency (such as eliminating manual steps) or to add new functionality. For example, we observed that paper handouts and forms are used throughout operational environments for real-time fund market announcements (fund management changes, fee changes, fund dividend dates, etc.). To support this need, we added intranet browser technology to the interface, creating an electronic bulletin board for operational messages.
Analytical sessions illuminated opportunities for system improvement. They also illustrated holes in our understanding of work activities. We used new workflow designs to create use cases for object development and storyboards for GUI navigation design, which in turn were used as scenarios for usability evaluations (Figure 2). As each analytical session built upon another, every team member had the ability to recite a variety of core workflow stories. Group understanding of how work occurred was extremely valuable to the creation of a shared vision.
The design process progressed from workflow observation to workflow design to storyboarding a new navigation structure. Figures 3 and 4 illustrate the vision that resulted from workflow analysis. Conceptually, the new vision simplified the operator interface by collocating data and functions that naturally go together as part of a workflow. We used the workflow analysis to identify these linkages and to identify integration points, that is, places where one task needed to easily flow into another. From an interface design perspective, integration points evolved into navigation pathways or branching controls available in the GUI. We evolved the redesigned work and the high-level UI via storyboarding. The storyboard was iterated using the work scenarios that were observed during data gathering. Ultimately, these storyboards became the UI screen flow or navigation structure.
The account details window illustrates the benefits of collocating data and functions. It is the base screen in the application, presenting an overview of a shareholder's account relationship with the mutual fund management company. We designed this window to contain up to 80 percent of the information requested in a typical shareholder phone inquiry. Key data fields on this window evolved from our observation sessions. During each inquiry call, we observed operators reviewing maintenance actions on accounts for recent address changes (important for fraud detection). We also noted that shareholders and dealers often call to confirm recent financial transactions. We collocated high-use ownership, maintenance, and financial data within one window to reduce operator movement, or navigation, through the interface. Collocating data and functions related to workflow garnered many positive responses from operators. Comments from operators who used the interface frequently included these phrases: "I like the accessibility," "It is useful having pertinent information on one screen," "I enjoy that the information is in the same logical order as the call," and "It seems to be presented just as the flow of most calls would be."
Figure 4 illustrates the evolution of a new account setup (NASU). Workflow analysis demonstrated a close link between account maintenance and NASU. Both included the requirement to enter or change account descriptors (joint, single, IRA, etc.), ownership and registration descriptors (name, address, age, etc.), and account options and attributes (bank instructions, systematic investment choices, alternate addresses, etc.). Very often processors would model or copy a new account from an existing account and change it as necessary. The new vision brought the interaction design of the two processes closer together and took advantage of the same window controls and navigation pathways.
Work sequence descriptions became more detailed as we incorporated additional contextual interview data. (See Figures 4B and 4C.) The schematics were useful in comparing and contrasting different client work structures. Generally, our analysis noted a similar process regardless of the work environment. In part, this was because all clients were addressing similar customer needs (shareholders and their mutual fund business) and all clients were using the same basic technology (DST's software suite). Still, exceptions existed and were often represented as conditional branches on the schematic.
The design team used the detailed workflow schematics and window storyboards to evaluate the success of the design concept. At first, we evaluated the most frequent path or "happy path" using typical work scenarios observed during data gathering. We modified and re-evaluated the design whenever it could not accommodate a particular scenario. We continued this cycle of design evaluation and modification using more unusual or less likely scenarios. This was a very fast evaluation cycle. Evolution would occur several times within the span of a three-hour design session. The team used white boards and sticky notes to represent design ideas. Designs drawn on white boards are easy to change. We stopped iterating when the interaction design was less fragile and when we could address 80 percent of the work scenarios. Improving 80 percent of the work gave us the biggest bang for the buck and avoided the inevitable "analysis paralysis" that results from designing for every event.
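That evaluate-and-modify cycle can be sketched as a simple graph check: the integration points identified in workflow analysis become edges in a navigation graph, and a work scenario is accommodated only if every step-to-step transition has a pathway. The window names and pathways below are hypothetical, not the actual TA2000 navigation structure.

```python
# Hypothetical navigation structure: each window maps to the set of windows
# reachable from it via a designed pathway or branching control.
pathways = {
    "search": {"details"},
    "details": {"maintenance", "financial_history", "new_account"},
    "maintenance": {"details"},
    "financial_history": {"details"},
    "new_account": {"details"},
}


def scenario_supported(steps):
    """True if every consecutive pair of windows is linked by a pathway."""
    return all(dst in pathways.get(src, set())
               for src, dst in zip(steps, steps[1:]))


# A typical inquiry-plus-maintenance call flows through the design...
print(scenario_supported(["search", "details", "maintenance", "details"]))
# ...but this scenario exposes a missing pathway, prompting a design change.
print(scenario_supported(["search", "financial_history"]))
```

Each scenario that fails the check is a candidate for a new pathway; stopping once roughly 80 percent of scenarios pass mirrors the team's rule for avoiding analysis paralysis.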
Protecting the integrity of the integrated workflow structure and maintaining consistency as detailed development of windows progressed was a challenge. The number of windows required in the interface was too large to be the sole responsibility of any one person (currently, there are approximately 200 windows). Separate teams of programmers and BAs were assigned responsibility for well-defined functional areas. The BAs were directly responsible for the interaction design. According to the direction of the project team leader, programmers made no creative screen design changes without consulting the BA designer. Early on, the BAs were not experienced window designers. A mentoring process was established to assist them in translating their concept of workflow into detailed window designs. To develop design skills within the team, we again introduced vendor courses, gave small group learning sessions, brought new team members into ongoing design sessions, and, most important, formed a core GUI design team.
The core design team included me, for human factors skills; two BA workflow experts; an application programmer; and a rotating (new) member. The GUI design team produced the high-level window designs and determined the major navigation controls. The details of these high-level windows were then completed by the responsible BA (the rotating member). Learning how to design was a shared experience in learning by doing. On any given day, the design team would review a completed detailed design for workflow and window consistency, develop a high-level design for a new set of functions, or participate in a work model analysis session for future functionality. Through consultation, advice giving, direction, and mentoring, the GUI design team instilled in each BA the insight, skill, and confidence to own the operator interaction design.
Once a UI structure was established, the team developed high-fidelity, detailed window designs in a prototyping tool. Having detailed window designs available helped the team discuss window layout, control, and labeling issues. One of our goals was to reduce training time, so the team was sensitive to DST's technical jargon and, where practical, replaced it with more understandable terms. We decided to keep the high-fidelity prototype code distinct from our application code. The prototype was used to explore and evaluate UI concepts and ideas. Keeping prototyping and application development separate reduced the all-too-frequent temptation to implement an incomplete or untested design idea. As the interface concepts evolved in detail, the fidelity of the prototype increased until it became fully interactive. As described by Rudd, Stern, and Isensee, we took full advantage of our low- and high-fidelity prototypes. Low-fidelity prototypes (paper screens) were used to evaluate design concepts and address structure issues. The high-fidelity prototype was used for evaluation, served as a living specification, and was used repeatedly to demonstrate the interface concept. The project team gave multiple demonstrations to new and existing clients several times a month throughout the development phase. Demonstrations served as another source of feedback, and the responses we got helped keep us focused on the needs of the user community.
At critical points in the design process, we sought feedback in order to measure our success. We followed the same strategy that was followed on the initial data gathering effort. We had real users perform typical work scenarios on paper versions of our window designs and later on the more realistic dynamic prototype. The evaluation scenarios were generated from the workflow design. The same mentoring approach to train the BAs was used during the evaluation cycles. BAs were taught to use the same data-gathering techniques during usability evaluations as were used in the field before design. In many cases we also recorded operator interactions in order to conduct more formal usability data analysis. We often showed clips of these videos to the project team in order to communicate both our success and challenges. Our ergonomic success criteria were applied to the evolving design to ensure that we were improving the operator interface through
- Increased efficiency, measured by counting steps in the process, number of windows, number of keystrokes or mouse clicks, and amount of time to complete an action;
- Improved quality (fewer errors), measured by the operator's ability to (1) anticipate the next step in an activity ("What are you looking for?" "What do you think will happen next?") and (2) detect and recover from errors;
- Reduced training, measured by the number of trials to reach a performance criterion; and
- Satisfaction, measured by rating scales on perceived efficiency, learnability, and task-flow acceptability.
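The efficiency criterion above can be summarized with simple before-and-after arithmetic. A minimal sketch follows; the baseline and redesign numbers are invented, stand-ins for the counts an evaluation session would actually produce.

```python
# Hypothetical measurements for one work scenario, before and after redesign,
# reported as percentage change per measure (negative = improvement here).
baseline = {"windows": 7, "keystrokes": 42, "seconds": 95.0}
redesign = {"windows": 3, "keystrokes": 18, "seconds": 55.0}


def pct_change(before, after):
    """Percentage change from a baseline measurement to a redesign measurement."""
    return 100.0 * (after - before) / before


for measure in baseline:
    delta = pct_change(baseline[measure], redesign[measure])
    print(f"{measure}: {baseline[measure]} -> {redesign[measure]} ({delta:+.0f}%)")
```

Expressing every measure as a percentage change against the observed baseline keeps the comparison honest across scenarios of very different absolute size.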
Our client partners also provided feedback on the success of the design. Through the first two years of design and development, project management held regular monthly meetings with key business clients. The clients helped put in priority order and evaluate our analysis and design decisions. Perhaps most important, they were exposed to the user interface design methodology and development success criteria. This was particularly helpful during the usability evaluation sessions because clients would volunteer to send their processing personnel to our facility to get hands-on experience with prototype versions of the software. Clients were also given our high-fidelity prototype to "try before you buy" in their own environment. These participatory experiences encouraged clients to support our efforts to introduce contextual design and analysis methods throughout the DST development organization.
DST is a technology service company. In that role, it must be responsive to its clients' requests for modifications and enhancements to its software and services. Clients' requests are communicated through management and can be quite remote from direct user experience. For many months, the team had Nielsen's [3, p. 14] advice posted on the door of our analysis room:
Vice presidents and other corporate executives should realize that they are no more representative of the end users than the developers are.... [Vice presidents'] intuitions about what would make a great design may not be accurate.
Confronting powerful intermediary customer links from within a service company environment is challenging. At times, despite whatever workflow-based evidence or analysis we had, we adjusted our designs to meet the goals of executive management rather than our perception of users' needs. Competing priorities are a typical phenomenon in all development projects. Our relatively novice BA team was challenged in knowing how to question and evaluate particular interaction design change requests.
As in any development effort, time constraints are intrinsic. We often sacrificed formal work practice modeling and documentation. As a group, we used whiteboards to quickly iterate and evolve our work models into high-level storyboards. The storyboard screen flows encompassed both the workflow decisions and the window designs. We then erased the whiteboards. All further iteration was done using the storyboard screen flows. We regret that sacrifice. As other teams join in the continued development of our desktop, they are forced to learn the workflow structure by examining screen flow structure. Understanding the rationale of why a design is presented in a certain way is difficult to ascertain from the window structure. As team members move on to other projects, the detailed workflow knowledge they have attained will be lost. To mitigate the loss, we have initiated an oral history process to recapture the design rationale. We are recording our BAs talking about user workflow and how it is realized in the user interaction design. We will organize these recordings into a scripted presentation and share it with new teams that are adding functions to the desktop. We hope that a historical perspective on user workflow and how it is incorporated into the design will assist new teams in maintaining the integrity of the user interface.
Change is difficult. The introduction of field research methods, explicit workflow design, and usability evaluation was new to the DST development enterprise. Many times we were asked "What makes your project so different?" Our answer was "the use of rapid design iteration and explicit methods to understand user needs." Success on one project, however, does not automatically lead to acceptance of a methodology on other projects. There are lots of reasons to resist change. It is common to hear someone say, "I have been designing screens for 10 years; I know what is best." People are not always comfortable accepting that they don't know what the user wants. Also, project success criteria may dictate that schedules and functionality are more important to the success of a development effort than operability (i.e., human factors issues). To reverse that argument, operability (work) issues must be made visible to the development manager early in project development.
Before there was a desktop concept or vision, work practice observations had already been initiated. These observations informed our executive management and encouraged discussion of training time, processing efficiency, error correction, and quality of work. Putting work practice at center stage from the beginning of the development effort made operability and usability characteristics key success criteria for the interface.
Multiple opportunities to observe users and refine the observation and interviewing process were available because of the large number of client sites. Clients were always eager to have us visit to see how their operations were unique. We found that iterating between analysis and observation cycles provided a learning opportunity to refine the team's data gathering skills. During the early group analysis sessions, I found that analysts were good at answering "how" and "what" questions but not always "why" questions (e.g., "Why did the operator open a batch queue before processing the transaction?"). Knowing the answer to the "why" question allows designers to support the user's work strategy. Our BAs needed to learn that lesson; cycling between observation and analysis helped. It seems easier to appreciate the benefits of knowing a user's work strategy if you have experience in applying the knowledge during design.
Demonstrations to our clients and potential customers were invaluable sources of feedback on our ideas and inevitably garnered feedback about our approach and methodology. We found that clients were delighted that we were taking a customer-centered design approach. Once they understood that our methodology placed their operators at center stage and that we wanted to understand the efficiency and training roadblocks, they were eager to have us visit their processing environments.
From the beginning, we had a small design team with a variety of skills. Business, software, and human factors skills provided a variety of perspectives. The small size was easier to mentor and easier to organize. As the development team grew, we kept a small, consistent core design team and had additional rotating members. Members joined or left as the system functions they were responsible for became relevant or were "finished." The small team was generally aware of all window designs and was an effective force for maintaining the integrity of the interface. Having rotating memberships broadened the skills training beyond the core team and maintained a high degree of historical knowledge.
Perhaps the part of this project with which I am most pleased is the success of the mentoring process. I began with a BA team with no prior usability engineering experience and limited development experience. Learning by doing was a process that was emphasized by the project management team. As a result, in addition to the new desktop rollout at year end, the company has a group of BAs that are sensitive to users' needs and can apply various methods to respond to them during their next project in software development. The value of the entire process, from field observation to work analysis to interaction design and evaluation, was demonstrated by the actions of the entire analysis and design team. Working and learning as a group allowed us to leverage our skills and develop a "trusted coworker" atmosphere. Mentoring was an important piece of our success because it encouraged delegation and ownership of the process. Rather than having one person responsible for "usability," the entire BA team learned how to appreciate and control the consequences of their collective user-interface design decisions.
I would like to thank project leaders Paul Lyons and Bridget Wagstaff for their collaboration and support throughout this project. I thank Rick Guess, Suzanne Ryan, and Mike Willigan for contributing to the workflow descriptions contained in this paper.
DST Systems, Inc.
1055 Broadway, BW09
Kansas City, MO 64105
All trademarks mentioned herein belong to their respective companies.
Figure 1. Call center processing schematic. A customer contacts a call center associate, and the associate accesses several TA2000 applications to address the customer's requests. The Automated Work Distributor provides image routing capabilities (e.g., letters, forms) and manages the work flow across the call center. Applications are all available through the same input-output devices (keyboard and mouse control); however, each has its own interaction style, and there are few supportive connections (e.g., the operator must re-enter data, and interactions in one application may not reflect actions taken in another).
Figure 2. The project methodology incorporated both the elements of object-oriented development and the human factors process. To date, this has been a 2-year project with more than 600,000 lines of code generated and a development team of more than 40 people. At any given time, all phases of development (data gathering, analysis, design, testing) occurred simultaneously for different functional deliverables (inquiry, maintenance, financial transactions, and new account setup).
Figure 3. Our vision for the new call center processing schematic. A customer contacts the call center associate. The associate accesses a variety of applications to address the customer's requests but is unaware that separate TA2000 applications exist. Data sharing and cross-application workflows are supported. Automated updates to AWD are supported. Interaction styles across form-filling, file maintenance, inquiry, and processing functions are consistent and support fluid transitions within and between work activities.
Figure 4. The evolution of a new account setup activity. (A) Observed workflow steps from one operator at one particular client site. Several applications are in use, and several manual actions are also represented. (B) The linear workflow sequence is abstracted to several simple steps. The abstracted NASU sequence is not unlike creating a new document in a word processor: operators are motivated to make use of any information that already exists, and copying from an existing document and modifying it as needed keeps information in sync, reduces re-entry, and is faster than starting from a blank form. (C) The workflow schematic is elaborated and enriched with detail in order to handle exceptions; it now contains conditional branches. (D) The beginnings of a storyboard. The workflow is mapped to a schematic window sequence, or user environment design. The application image contains the source material. Base screens for the application are the Search and Details windows.
©1999 ACM 1072-5220/99/0100 $5.00