Help! User assistance and HCI

XIV.1 January + February 2007
Page: 32

Overcoming a common help design challenge


Author:
Doris Holloway

How do you design user assistance for an audience to whom you don’t have access? The biggest challenge, and the largest gap in our design process, was the lack of interaction with the users of our Help products. Our Fortune 500 company provides software products and services to the financial services industry, and we are a group of user-assistance developers tasked with documenting the products for one market segment. While we had access to internal customers (product strategy, design, development, customer support, and implementations), we wanted to solicit design input from external users who were extremely diverse in their information needs, knowledge, and skill levels. We faced that challenge as we designed a Help system for a new Web-based software product our company was developing.

In this case study, I focus on how we evaluated the Help content and the overall usability of the Help system.

Roadblocks to User-Focused Help Design

Our company launched a new generation of software with the new Web-based product and made significant changes in its software development processes and system architecture. The new process focused on user-centered design following the Rational Unified Process (RUP) development methodology. Although we had solicited requirements from users in the past, our new processes called for extensive user input in the design phase. Customers were brought on site for joint application development (JAD) sessions to provide input into the system design. We placed great importance on giving the Help design process the same opportunities for user input, but our Help development team faced several challenges, illustrated in the table on the opposite page.

Building Better Help Content

Developing Help for the new Web-based product provided an opportunity to build a better Help system. The Help content we had designed in the past occasionally fell short of customer expectations, and we had never involved external customers in the design of any previously developed Help system. While we provided avenues for external and internal customers to give us feedback (feedback forms, email addresses, fax numbers), we received little of it. Based on the feedback we did receive from customers, however, we developed several objectives for the new Web-based Help content:

  1. Easily accessible content. Our users were frustrated with the poor information architecture of existing Help systems:
    • The structure and organization of the Help content made it difficult to find information.
    • The search feature did not allow a full-text search.
    • The information architecture bore no relationship to the application or the tasks the user performed.
    • The information architecture provided few clues as to where different types of users could go to find the information they needed for their specific role or function.
    • Proper business terminology was not used, making it difficult to find content.


  2. Comprehensive content. Our former Help systems primarily focused on the tasks that users performed to complete a screen. This level of content addressed only the needs of the lowest-level users; it did not provide the information needed by diverse audiences, especially those who required detailed information to conduct research or troubleshoot issues. We needed to broaden the scope of information we provided in the new Help system.

  3. Help design matches application design. While most of our previous Help systems were integrated into the software products, the visual design of the Help and the application were developed independently. When a user launched the Help system, it bore no design relationship to the application. We wanted the new Help system to appear as a fully integrated part of the application, with background colors, banners, fonts, graphics, and other visual-design elements matching.

  4. Web look and feel. We wanted the new Help system to look and feel like a Web page, since it would be integrated into a Web application. We didn’t like the "Help system" look and feel that most Help authoring systems generate, with rigid information-design structures and limited visual-design elements. As a result, we chose to "hard-code" the first few layers of the Help system to provide more flexibility in visual design and accessibility (a sketch of the idea follows this list).
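
To make the hard-coded approach concrete, here is a minimal sketch of the kind of top-layer page generation involved. It is not our actual implementation: the stylesheet path, CSS class names, and function are hypothetical, illustrating only the idea of hand-building the top layer so the Help inherits the host application's banners, colors, and fonts.

// Hypothetical sketch (TypeScript): generating the hand-coded top layer of a
// Web-based Help system so it shares the host application's visual design.
// The stylesheet path, CSS classes, and topic list are illustrative names.

interface TopicLink {
  title: string;
  href: string;
}

function renderHelpShell(appTitle: string, topics: TopicLink[]): string {
  const toc = topics
    .map((t) => `<li><a href="${t.href}">${t.title}</a></li>`)
    .join("\n    ");
  return `<!DOCTYPE html>
<html>
<head>
  <title>${appTitle} Help</title>
  <!-- Reuse the application's stylesheet so banners, colors, and fonts match -->
  <link rel="stylesheet" href="/app/styles/application.css">
</head>
<body class="app-page">
  <div class="app-banner">${appTitle} Help</div>
  <form action="/help/search" class="app-search">
    <input type="text" name="q"> <input type="submit" value="Search">
  </form>
  <ul class="app-toc">
    ${toc}
  </ul>
</body>
</html>`;
}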

To accomplish these objectives, we developed a design strategy that would:

  • solicit customer feedback throughout the Help design and development process
  • use internal and external users to validate the initial Help design prototype before we began building content
  • use external users to validate the content and ensure it met all their information needs
  • obtain hands-on input from external users on navigation and accessibility

Leveraging the Knowledge of Internal Users

In developing an initial Help design and prototype, we leveraged the knowledge of our internal users from product strategy, design, and development to identify the information in the table below.

[Table not reproduced.]

We used the design documents created in the JAD sessions with user input to identify the business processes and tasks that users would perform in the new application. We then developed an initial prototype, which our internal users validated. The prototype provided the first few "layers" of the Help system, including TOCs, search functionality, navigation devices, and a basic Help-page model. Our internal users also answered our questions about users’ information needs. This invaluable feedback helped us refine the information architecture and present a better-designed prototype to external users.

External Users Validate the Help Prototype

Before we authored content, we needed external users to validate the initial Help prototype, and we wanted to give them a first peek at the new Help system design. The product-strategy group understood the importance of external customer feedback and was willing to share some of the customers’ time with us. They coordinated a review of our prototype with a group of customers they were bringing in to review requirements. Our plan was not only to solicit feedback on the new Help system design but also to gather information about our users. We provided the users with a paper prototype of the new Help system as well as a 50-question survey used to gather demographic data on:

  • Who used our Help systems, how often, and why
  • What information they looked for or expected to find in Help systems
  • Specific questions on the paper prototypes

The customer review yielded the following results:

It confirmed the diversity of our user base. User responsibilities ranged from a very narrow functional focus to a wide range of duties. Overall, users performed a variety of tasks including routine data processing, reporting, troubleshooting, implementation, and administration. Additionally, our users’ experience ranged from little or no business experience or knowledge of our applications to extensive industry experience and system knowledge.

It provided insight into what types of information users looked for and why they accessed our Help systems. Reasons included learning how to perform a task they had never been taught, relearning tasks they had forgotten how to do, and finding troubleshooting information.

Users indicated they would not use an index. Our initial prototype included an index feature along with a TOC and Search functionality to aid in navigation. Our customers indicated they would rarely use the index.

Our initial Search design was poor. The Search results did not provide enough distinguishing information to guide users in selecting the correct result. The users also wanted advanced search features and a ranking to indicate search-content relevance.

Too many clicks frustrated users. They wanted hyperlinks to get them where they needed to go quickly with a minimum of navigation attempts.

Users wanted "see also" information to point to further sources of information. They wanted to access other information that would be useful when they didn’t understand something fully or needed more information.

We needed to clearly define terms and use them with care. Roles and functions were sometimes referred to differently at different user sites.

As a result of this feedback, we:

  • improved the Search design and functionality (a sketch of the ranking idea follows this list)
  • set standards for the number of clicks required to reach a Help topic
  • added more information to the See Also section
  • defined roles and added these definitions to a glossary and to Help topics that contained information specific to those roles
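
As a minimal illustration of the ranking idea only (a hypothetical sketch, not the product's actual search engine, which is not detailed here), a full-text search over Help topics can score and order results like this:

// Hypothetical sketch: relevance-ranked full-text search over Help topics.
// Simple term-frequency scoring, with title matches weighted higher, shown
// only to illustrate ranking; names and weights are illustrative.

interface HelpTopic {
  title: string;
  body: string;
}

function searchTopics(topics: HelpTopic[], query: string): HelpTopic[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return topics
    .map((topic) => {
      const text = `${topic.title} ${topic.body}`.toLowerCase();
      let score = 0;
      for (const term of terms) {
        score += text.split(term).length - 1;   // count occurrences of term
        if (topic.title.toLowerCase().includes(term)) {
          score += 5;                           // weight title matches higher
        }
      }
      return { topic, score };
    })
    .filter((result) => result.score > 0)       // drop non-matching topics
    .sort((a, b) => b.score - a.score)          // highest relevance first
    .map((result) => result.topic);
}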

The analysis of the demographic data further validated the diversity of our users and the need to provide content for different skill levels and knowledge needs.

Does the Help Content Meet the Needs of Diverse Audiences?

After we developed a sound basic design, we created the Help-system content. Our primary sources of input were RUP design documents, the developing user interface (UI), and feedback from internal users. We needed to ensure that the content met the needs of diverse audiences.

Before being released for general availability, the new product went through user acceptance testing (UAT) with one of our customers. During the UAT process, an implementation team worked with the customer to test the functionality of the application and to validate that the application UI matched the system-design documents. We took the opportunity to team with our implementation organization to perform UAT on the Help system as well.

Although we solicited content feedback from only one customer, this UAT customer represented many different skill levels and information needs. In addition, the implementation team itself identified and represented the information needs of our internal users when providing feedback on content.

While a UAT often provides only a superficial review of Help, the new process ensured that the customer and/or our implementation staff reviewed the entire Help system in careful detail. Because we were inserting ourselves into a challenging and extensive UAT process on an entirely new product, we limited the scope of what we asked users to test in the Help system. The UAT process focused on the reliability of the Help content, not on its usability. The types of feedback we received are outlined in the table below.

[Table not reproduced.]

One Help design objective was the development of comprehensive content beyond procedures and field definitions. Though we had developed some of this content, the UAT process revealed further information gaps in the Help wherever users struggled to understand how to use certain areas of the new application. The implementation and design teams drafted documents that included best practices, process and data flows, and troubleshooting tips, which were added to the Help system. The implementation team also logged open issues with the application. We reviewed these logs to identify areas where we could provide content to help users better understand how the application worked and how to use it.

The major benefit of taking Help through UAT was identifying content gaps. The UAT customer’s careful, critical, and thorough review of the Help system improved the content for all future users. By partnering with the implementation team, we better identified the information needs of our diverse audiences.

Testing Usability of the Help System

UAT provided feedback on the Help content but not feedback on the usability of the Help system. A primary Help design objective included providing easily accessible content, and we wanted feedback on the redesigned search functionality and overall content accessibility.

We decided to conduct usability evaluations at our company’s annual users’ conference, using customers as participants in group evaluations. In the few usability tests we had performed in the past, we tested one participant at a time, so we were concerned about conducting a test with several participants from different companies and backgrounds and with unknown skill and knowledge levels. However, we found the interaction among the various users beneficial and productive. We asked everyone to think out loud, which sparked discussion and feedback we didn’t typically receive with single-participant testing. Our participants included 28 customers from 15 companies in ten usability sessions.

In our test sessions, we had access to the application and Help system but faced several other restrictions:

  1. We couldn’t schedule our usability testing during other sessions or user-group meetings, which limited us to 30-minute time slots between sessions and short timeframes before the sessions began and after they ended.
  2. We had access to a meeting space for only a day and a half. This issue, along with the 30-minute restriction, narrowed the number of testing opportunities.
  3. Although we had access to a computer, and users would be able to do hands-on usability testing, we did not have access to a formal usability lab.

Our test strategy included:

  • Keeping the usability test short and the participants on task. We limited the scope of our usability test to nine questions, designed to identify:
    • Could users launch Help from the application without assistance?
    • Could they navigate to Help for a specific function without assistance?
    • Were procedures detailed enough for them to perform a task?
    • When asked to find information on a specific topic (field definition, process flow, conceptual information), could they easily find it? We further asked them to provide feedback on the usefulness of these various topic types.
    • Would they intuitively use the Search feature to look for information?

(For each task, we also recorded whether the user performed it successfully, how long it took, how many tries were needed, how they found the information, customer observations and comments, and any additional observations or comments we felt were appropriate; a sketch of such a record follows this list.)

  • Testing several customers together in each 30-minute time slot.
  • Performing a simple usability test with one person asking questions and one person recording results and times.
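
A consistent record per task keeps ten short sessions comparable. A data structure along these lines captures the measures listed above; the field names are illustrative, not taken from the original study:

// Illustrative record (TypeScript) for one observed usability-test task.
// Field names are hypothetical; they mirror the measures described in the
// test strategy above.

interface TaskObservation {
  question: string;            // which of the nine test questions was asked
  succeeded: boolean;          // did the participant complete the task?
  seconds: number;             // time taken to complete (or give up)
  attempts: number;            // how many tries it took
  pathUsed: "TOC" | "Search" | "See Also" | "Other";  // how they found it
  participantComments: string; // what the customer observed and said
  observerNotes: string;       // any additional observer comments
}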

Findings from our Group Usability Sessions

Of all the testing we performed, these usability sessions yielded the most significant suggestions for improving the Help system.

Improve the display of topics in the related information table. Our Help model included a table with task information on one side and See Also information on the other. Customers went directly to this table to find the information they wanted. They made several suggestions for making this information more accessible (a sketch applying them follows this list):

  • Place a border between the two columns to better separate the information.
  • Always include field descriptions and system messages at the bottom of the See Also table since the users frequently access this information.
  • Alphabetize all other information for ease of access.
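
These suggestions are mechanical to apply. As a hypothetical sketch (the entry names and function are illustrative, not the product's code), ordering the See Also entries per the customers' suggestions looks like this:

// Hypothetical sketch: order See Also entries the way customers asked.
// "Field Descriptions" and "System Messages" are pinned to the bottom
// because users access them frequently and expect them in a fixed place;
// all other entries are alphabetized for ease of access.

const PINNED_LAST = ["Field Descriptions", "System Messages"];

function orderSeeAlso(entries: string[]): string[] {
  const regular = entries
    .filter((entry) => !PINNED_LAST.includes(entry))
    .sort((a, b) => a.localeCompare(b));   // alphabetize the rest
  const pinned = PINNED_LAST.filter((p) => entries.includes(p));
  return [...regular, ...pinned];          // pinned items always go last
}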

Ensure that users can access all important information from the See Also table. Customers did not read overview information. Our Help model began with overview information that included hyperlinks to process flows. We found that in almost every test, when asked to find the process flow, users completely skipped the overview information and searched for it in the Tasks/See Also table that followed the overview. We realized that we needed to place the most important information in this table. As a result, we moved the process-flow information to the See Also section. While we did not remove the overview information, we learned not to include hyperlinks or other relevant information in that section of the Help topic.

Improve the flow diagrams. Our Help model included flow diagrams, some multiple pages long, which we split into two or more files to make printing easier. Customers suggested we create a "printer-friendly" or PDF version for printing and not split the HTML file into multiple files for multipage flows. The customers also suggested adding color to the flow diagrams (green for "start" and "yes," red for "stop" and "no") to help users more easily understand the flow.

Further improve search access. While users liked the improved search feature, which now included rankings and advanced search, they found the link to Search too small and had difficulty finding it. They suggested we make it more visible; we did so by increasing the font size and changing the placement.

Provide revision dates. Customers wanted a revision date on the Help topics so they could identify the last time the information changed.

Much of the feedback from this exercise suggested changes that would never have occurred to us, and we implemented them relatively easily. We had become so familiar with the Help system that we were surprised at the design features over which our users stumbled. This experience was a humbling reminder of the benefit of usability testing.

Getting Feedback After Product Implementation

Customers suggested that we add the ability to email a Help topic. Many users provided troubleshooting support for their companies and wanted to email a Help topic to other users rather than print it or copy the information into an email. We also added the ability for customers to send us feedback on each Help topic and to rank the usefulness of that topic.
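
As a sketch of how such per-topic feedback features can be wired up in a Web-based Help page (the endpoint, payload shape, and function names are assumptions for illustration, not the product's actual API):

// Hypothetical sketch of per-topic feedback wiring for an "Email This Page"
// link and a "Was This Information Helpful?" form in a browser-based Help
// page. The /help/feedback endpoint and payload are illustrative only.

function emailThisPage(): void {
  // Open the user's mail client with the topic title and URL prefilled,
  // instead of making them print the topic or paste it into a message.
  const subject = encodeURIComponent(document.title);
  const body = encodeURIComponent(`See this Help topic: ${location.href}`);
  location.href = `mailto:?subject=${subject}&body=${body}`;
}

async function wasThisHelpful(helpful: boolean, comment: string): Promise<void> {
  // Send the rating and optional comment back to the Help team.
  await fetch("/help/feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ topic: location.pathname, helpful, comment }),
  });
}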

Post-Implementation Notes

After the company implemented the application at several customer sites, we continued to receive feedback on content through the Email This Page and Was This Information Helpful? features. Our internal and external customers find these features easy to use, and we find that this ease of use encourages feedback. The types of feedback we receive are typical of the content validation we received in UAT and serve to ensure accurate and complete Help content. Additionally, we receive comments indicating users can’t find a specific topic, enabling us to improve navigation.

Lessons Learned

In this first experience including users in the Help design process, we learned the following lessons:

Clearly identify testing needs up front. By identifying specifically what we wanted to test, we limited the scope and scale of testing and better matched our testing needs with opportunities for access to the right customers.

Team up with other departments. Because we were competing with other areas for limited access to users, we found opportunities to team with other departments that brought in customers or worked at customer sites. Because we limited the amount of time we needed to spend with customers and the number of requests we made, most groups were willing to give up some time for us.

Work with many different customers. In Help development, we often interact with only one beta customer. We found that working with many different customers helped us better identify and address the needs of very diverse audiences. And when many different customers provided the same feedback, it reinforced the areas that needed improvement.

Allow customers to interact with one another in testing sessions. We were surprised at the level of interaction between customers during the usability testing at our users’ conference. Having customers from different backgrounds, with different levels of knowledge and skill, interact with one another proved to be a catalyst for generating ideas and discussion. It encouraged the customers to talk about their different business needs and their different reactions to and perceptions of the Help system, and to brainstorm suggestions that would improve the Help system.

Provide easy ways for customers to provide feedback. Web-based Help systems make it easy for customers to provide direct feedback on specific Help topics. When we first implemented the Email This Page and Was This Information Helpful? features, we were concerned that we would open up a floodgate of criticism or irrelevant feedback or that no one would use it. To the contrary, we found that both internal and external customers are using these devices to provide specific and valid feedback. This feedback helps us continually improve the content of the Help system.

By involving customers, both internal and external, in the design of the Help system, we developed a much more usable Help system. Our internal customers (product strategy, design, development, implementations, and customer support) lent their expertise to help us design content that met the needs of very diverse audiences. In working closely with these organizations through the design, development, and implementation of the new product, we identified customer information needs and added content to meet them, creating a more robust Help system. Our external customers provided invaluable feedback on usability, specifically on access to information and navigation. As designers of the Help system, and thus as users who were very familiar with the design, we found it difficult to identify usability issues ourselves. By seeing where our customers stumbled, we made fairly minor modifications to our design that greatly increased its usability.

Author

Doris Holloway
Fidelity National Financial, Inc.
doris.holloway@fnf.com

About the author:

Doris Holloway has 20 years’ experience in user-assistance development and currently leads a team of user-assistance developers at Fidelity National Information Services, Inc. in Jacksonville, Florida. Doris has an MBA from the University of Florida.


©2007 ACM  1072-5220/07/0100  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

