Back to school: HCI & higher education

XII.5 September + October 2005

Discovering user information needs


Authors:
Frank Ritter, Andrew Freed, Onida Haskett

University department Web pages are the focal point for prospective students, current students, parents, staff, and alumni who want to explore the university. Users visiting these sites expect to find the information they seek, perhaps most notably contact information for various people within departments, but also a wide range of information related to a specific department.

University department sites currently vary widely in style and content. This could be attributed to differences in departmental philosophies and to the range of tasks each department must support. Content will also vary simply because different departments have different information to present, and some may put more or less effort into their design. In some cases, sites appear to vary because they were designed without a plan of what to include.

There are likely, however, numerous common types of users and tasks that all university department sites should support. We believe a task analysis (some would call this a content analysis) is the place to start when creating a successful department Web site. Our task analysis is a set of tasks that a university department Web site could support for its users.

We present a task analysis of user groups and what users look for on university department Web sites. We developed it through a wide range of analyses, including reviewing existing department Web sites, departmental hard-copy handout materials, and search-engine queries, and by interviewing users to see what additional information they require. The list of user groups and the list of tasks are likely to be difficult to keep in mind, and would be difficult to generate alone in a single sitting. That is one of the lessons of this analysis.

While most directly applicable to department Web sites, this analysis is reusable by others. It is generalizable and can be modified for use on other types of Web sites, including nonprofit, corporate, e-store, or university athletics sites. For example, we have used it to design a nonprofit's Web site.

We compare this task analysis to a sample of current department Web sites to show that it generates useful suggestions, and we use the comparison to find tasks that we missed in our earlier analyses. We conclude with a guide on what to do with department Web sites after they are built, including maintenance based on this task analysis, and marketing. Please note that this report focuses only on task analysis and does not cover design elements of Web sites. Refer to other resources on Web-site design to apply our task analysis (e.g., [3] and [9]).

The Types of Users. Table 1 presents a listing of the types of users that we were able to enumerate. Some users will fit under multiple categories (a problem that we have not yet tried to disentangle). This list is still likely to be incomplete, but it provides a wider range than we ourselves have thought of on any one occasion, and it is more complete than the list our department used on its first Web-site design. As department Web sites are developed, it would be productive to keep these user groups in mind. Further analyses of other sites can, of course, expand this list.

We began our task analysis by looking at a sample of existing department Web sites and extracting a list of tasks that they support. We initially visited two department Web sites at Penn State—the School of Information Sciences and Technology (IST) and the Department of Computer Science and Engineering (CSE). We examined these Web sites for the tasks they supported and noted those tasks in a list. As new tasks were discovered with other approaches, the list was augmented; it served as the initial draft of the task analysis. (The complete task listing is presented later in Table 4 if you wish to skip ahead.)

Solution Summary—Task Analysis Overview

Task analysis refers to a family of techniques for describing various aspects of how people work (e.g., [1], [6], [13]). It provides a deeper understanding of the goals people are trying to achieve, and it offers an approach for capturing the essence of users by defining their tasks and, in this case, the information they will wish to acquire from a university department Web site.

Task analysis should be used during the design process because it acts as a road map for a design team. In each portion of the design, the task analysis can be used as a guide to answer the question: "Does the design support the users’ tasks?" With a complex set of tasks, it will be useful to enumerate them and refer to the list during design and redesign.

To begin a task analysis for a Web site, there are three fundamental steps to follow. First, the designer needs to know what groups of users will be using the site. Next, they need to consider what information those user groups will need to access, creating a list of tasks that different users will perform using the Web site. Finally, the designer can note the pages most frequently viewed by users and the tasks that these pages accomplish, and modify the design to make the most important or common tasks easier to do.
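As a concrete illustration of these three steps, here is a minimal sketch in Python. The user groups, tasks, and page-view counts are hypothetical placeholders for a department's own data; only the shape of the bookkeeping matters.

    from collections import Counter

    # Step 1: enumerate the groups of users who will use the site.
    user_groups = ["prospective student", "current student", "parent",
                   "staff", "alumni"]

    # Step 2: list the tasks each group will perform (a fragment of a
    # Table 4-style listing; the full analysis has about 90 tasks).
    tasks = {
        "prospective student": ["admissions information", "program overview"],
        "current student": ["course schedules", "contact information"],
        "parent": ["contact information", "campus information"],
    }

    # Step 3: tally page views per task (numbers invented here) and rank
    # them, so the most important or common tasks can be made easier to do.
    page_views = Counter({"contact information": 5400,
                          "course schedules": 3100,
                          "admissions information": 2600})
    for task, views in page_views.most_common():
        print(f"{views:6d}  {task}")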

To build a useful task analysis, we must determine what information the users are looking for. Typically, users are studied directly, and formal manuals and processes are used to generate a normative model of use. The users of university department Web sites are diverse in many ways, including being geographically dispersed. Therefore, a wide range of approaches is needed to enumerate their tasks.

Solution Summary—Our Task Analysis

To generate our initial task analysis, we used a variety of methods, including analysis of departmental hard-copy materials, analysis of existing sites, reviewing Web search-engine logs, and asking existing users. Further details are available [10].

Hard Copies. One way to find out what information should be on a Web site is to look at other media. We started by collecting existing hard copies of information: an informational packet from the School of IST directed toward prospective students, as well as a graduate brochure from the CSE Department. The following types of information were encountered: Web-site printouts, printed brochures, pictures, and directories. The informational packets each contained nearly 150 pages of various materials.

The hard-copy materials supported most tasks that a prospective university student requires, such as admissions information, campus information, faculty listings, and an introduction to the given program. They also closely paralleled their respective Web sites. This analysis started our table of tasks.

It was interesting to see Web-site printouts in the hard-copy materials. This tells us that there is significant overlap between the hard-copy materials and the Web sites supporting them, and that handout designs may be influenced by the Web site. From this we conclude that hard-copy materials and department Web sites should be designed to work together.

Web-Site Search Queries. Examining queries to a site provides another way to discover what information users want from it. A search query tells us that the user wanted information on that topic, and it suggests that the topic was not easily found using the existing site. Alternatively, the user may have known exactly what they wanted and simply did not want to navigate through many pages. Search queries can be faster than hyperlink navigation, especially on sites that do not provide information in the structure users expect.

We examined search-query logs of Penn State's home page (www.psu.edu), provided to us by the Penn State Web master, to augment our analysis. We compiled a listing of the 250 most-searched-for phrases in the logs from the spring 2002 semester. The logs represented over 1,000,000 search queries.
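Compiling such a listing is mechanical once the logs are in hand. The sketch below shows one way to do it in Python, assuming a plain-text log with one query per line; the file name and format are assumptions, as the actual Penn State log format is not described here.

    from collections import Counter

    # Tally queries from a (hypothetical) one-query-per-line log file.
    with open("search_queries.log", encoding="utf-8") as f:
        counts = Counter(line.strip().lower() for line in f if line.strip())

    # Print the 250 most-searched-for phrases, most frequent first.
    for phrase, n in counts.most_common(250):
        print(f"{n:7d}  {phrase}")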

The queries were typically short phrases. We classified the top 100 phrases into three categories; within each category we list only the most popular searches. The largest category was phrases related to registrar functions, listed in Table 2. All of these items are provided by a centralized registrar's office in US colleges, so we do not include them in our task analysis. In other countries, such as the UK, these tasks are handled by departments, so several of these topics would be relevant to include on a departmental Web site, creating a more local task analysis for such sites.

The second group of searches was for colleges and departments within the university. These results indicate that many users were looking for department and college Web sites. Ideally, users coming to a university's Web site would be given help finding department sites.

We have noticed on multiple universities' Web sites that finding departments can be a problem. A department's Web site cannot directly address this task; university Web masters will have to assist. The department Web master can, however, help ensure that the site is correctly listed, so that it comes up in search engines, particularly the one provided by the department's university. We know that this has not always been the case at our own university, but it has improved since these logs were taken. A secondary explanation might be that how users view the university and how it views itself may differ, and therefore a task and needs analysis based on users is likely to be useful here.

The next group of queries is shown in Table 3. This list contains topics that department Web sites might reasonably be expected to include, among them many queries related to finding courses. Users were not finding this information directly from department (or university) Web sites, or preferred to use a search engine to find it. We added the equivalent of the items in Table 3 to our task analysis.

Departments can fruitfully monitor the most common items searched for on their own Web site. Such a listing suggests which topics are hard to find and which topics are not yet included.

User Interviews. We interviewed 13 users of university department Web sites: current and prospective students (eight), staff (one), parents (three), and an alumna (one). We showed them our preliminary list of tasks and asked them to tell us what additional tasks they thought should be supported.

All of the interviewees specified that they wanted contact information on a department Web site. They wanted phone numbers and email addresses for a wide variety of people in a department. This was already included in our list, so it indicates a strong desire for this information.

The interviewees also suggested the following new tasks to support: the schedule for finals, local information (weather, etc.), and intramural sports related to the department (which we read more generally as "social organizations and clubs"). While more users could have been interviewed, the last six interviewees could not provide additional tasks.

The Full Task Analysis. Table 4 shows the complete list of tasks. Note that some topics are listed twice because they are grouped that way by existing designs or, more importantly, by users. Remember that this list is to be used as a guide; particular departments' Web sites may not require all of these features. For example, a community college might not need to include information about graduate programs, and some schools might not have internship programs. Additionally, there may be information you want to display on your Web site that is not on this list. It is, however, intended to be a fairly complete list, and to be useful for checking designs.

We can now imagine checking Web sites to see if they provide this information. In the next section, we do this by hand, both to test the list's usefulness and to extend the task analysis.

Testing Our Analysis

To test our information-needs/task analysis, we examined several Web sites in detail to see how many of the tasks they supported. We expected to discover whether sites already supported all the tasks, and we also hoped to find some further tasks.

We selected Web sites that sampled several domains. First, we tried sites from three different universities—Penn State, the University of Illinois, and Rutgers. Second, we chose a range of disciplines within these schools—Information Sciences and Technology (IST), Psychology (Psy), Electrical Engineering (EE), and Business (Bus). Each is a well-done site, though they use slightly different designs.

We visited each site (in June 2002) and determined whether it supported each individual task, marking the corresponding table entries "yes," "no," or "not applicable" as appropriate. (These sites have changed since then.) It was always possible to tell whether a task was supported, because the sites were all either well organized or small. The results are shown in Table 4. Occasionally we also found new tasks at these sites; we recorded these, checked the other sites for them, and included them in Table 4 (as indicated with the asterisks). The lack of an item on a Web site may indicate a place for improvement, a difference in the department's focus, or simply that the item is not applicable.
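The bookkeeping behind this comparison is simple enough to sketch. The Python fragment below shows how such markings produce the coverage counts reported next; the task names and markings are invented placeholders, not the actual 90-item contents of Table 4.

    # Each task maps to a per-site marking: "yes", "no", or "n/a".
    task_support = {
        "faculty contact information": {"IST": "yes", "Psy": "yes", "EE": "yes"},
        "graduate admissions":         {"IST": "yes", "Psy": "yes", "EE": "no"},
        "internship programs":         {"IST": "yes", "Psy": "n/a", "EE": "no"},
    }

    for site in ["IST", "Psy", "EE"]:
        marks = [entry[site] for entry in task_support.values()]
        supported = marks.count("yes")
        applicable = len(marks) - marks.count("n/a")
        print(f"{site}: {supported} of {applicable} applicable tasks supported")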

The first Web site in Table 4 is Penn State’s School of IST Web site (ist.psu.edu). This is probably the newest site to be built on our list, and is one of the most complete. The IST Web site covered 78 of the 90 tasks.

The next column in Table 4 is the Penn State Psychology Web site (psych.la.psu.edu). This was a simply designed Web site, yet full of features, and it appeared to be designed with accessibility in mind. The Psychology Web site covered 46 of the 90 tasks. Some of the tasks might not be supported because it is a site for a department rather than a larger unit such as a school.

Our final Penn State site was the Electrical Engineering Department's (www.ee.psu.edu). This site covered 56 of the 90 tasks. The analysis provided suggestions for topics the department might consider including on its Web site.

The Electrical Engineering site at the University of Illinois was our next stop (www.ece.uiuc.edu). This site covered 62 of the 90 tasks. It also provided some interesting new tasks, perhaps because it is a large and prominent department.

Our final site was the Rutgers Business School (business.rutgers.edu). Due to its size and stature within its university, it is probably most comparable to the IST site, as they are both schools, a larger academic unit than a department. This site supported 46 of the 90 tasks. The analysis makes several suggestions for where this site could be expanded to support more user tasks.

Comparing the list of tasks to these five Web sites by hand led us to add seven new tasks, in addition to producing explicit descriptions of where these sites could provide more information. A wide variety of tasks was supported on all sites, showing an emerging agreement and commonality in many aspects of Web-site design. There were also interesting differences, which suggests that different university departments have different views about what is important to include on a department Web site.

Implications for Web-Site Design. This task analysis has implications for Web sites beyond initial design, including maintenance and the broader context of supporting users, such as dissemination so that potential users can find a site.

Dissemination. One of the first topics to consider is sharing the material in Table 4 with potential users. Fortunately, a wide variety of free and commercial services exists to help advertise a department's site, and there are useful ways to maintain a site. One of the most efficient advertising methods is to get one's site listed on search engines [4].

Listing your site in search engines manually is a simple but time-consuming activity. Most search engines have straightforward submission forms, where you enter your Web site’s URL and a few bookkeeping items. The difficult part of manual submission is finding the submission forms themselves. Therefore, Table 5 lists several of the most popular search engines and the URL for submitting sites to them. There are also many tools that will automatically submit Web sites to search engines. These services vary in cost and coverage, with costs in the range of $0 to $1,000+ per year, and coverage from four search engines to (reportedly) over 400,000 search engine entries.

Posting a Web site to search engines represents only one step in a successful advertising journey. "As a whole, the World Wide Web displays a striking 'rich get richer' behavior, with a relatively small number of sites receiving a disproportionately large share of hyperlink references and traffic" [8]. If a Web site is not listed near the top of search results, it will not be visited. Visits to one of the authors' personal sites increased 600 percent when the site was listed in the top ten sites of a directed reference site. Therefore, Web sites and all of their component pages should be optimized for search engines.

One can consult a range of sources for optimization tips that will help boost a Web site’s search engine ranking. These tips instruct a Web designer how to structure the content of their pages to take advantage of how search engines look for information. The tasks in Table 4 suggest that you will need to share this information widely among the developers of these component pages.

Another method of site advertising is directed banner ads placed on other sites. Previous work [4] suggests this might not be as productive as search engines, but it may be useful for new sites. For university departments, these may include alumni links, student links, and links to and from faculty sites.

Users, however, are unlikely to be widely enthusiastic about academic departments’ either displaying or generating banner ads themselves. It may be more appropriate to provide links to related degree programs, providing context in a more subtle way.

Departments already are repositories of knowledge, and their Web sites can and should support this. A feature that can be found on some department Web sites is genuine online content, that is, subject material resources [2]. Another outlet is themed resource sites. Many exist and they are generally glad to list sites. For example, if your department has developed a set of math resources on the Web, submit the site to Merlot (www.merlot.org) and Education Planet (www.educationplanet.com/topsites/math.html). Maintaining such resources provides an explicit way to promote your programs as well. In time, we believe that department Web sites will do this more often.

Perhaps a simpler method of spreading the news about a site is to mention it in other media. Use newsletters or publications in other media to advertise your site. Ask sites related to your site to post a link to your site, and in exchange post a link to their site.

The best marketing plan, we believe, is to get listed on all the major search engines, as well as in directed reference guides. It is better still to target your users by submitting to Web sites that they will visit. These latter links will occur when your programs are connected to their users and communities.

Maintenance. Nielsen and others suggest that an annual maintenance budget be set equal to the initial cost of building the site [3, 7]. Web sites that become outdated decline in quality, so you should protect your investment by spending time to maintain your site and keep it up to date.

This task analysis explains why maintenance takes so much effort. The topics in Table 4 are broad, and many change frequently. A successful department Web site will require as much maintenance as an average commercial Web site. Hyperlinks must be routinely checked to ensure that they still work; fortunately, software tools and services exist to do this [5]. We also believe a practical way to keep a Web site updated is to devolve its maintenance from the Web master to those who create or manage the information directly. With this approach, the person maintaining the paper phone list also maintains the Web version. A list of materials such as that in Table 4 provides a way to manage the updates.
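As a sketch of the kind of checking such tools perform, the following Python fragment fetches one page and reports external links that no longer resolve. It uses only the standard library; the department URL is a hypothetical placeholder, and a production checker would crawl the whole site, follow relative links, and respect robots.txt.

    from html.parser import HTMLParser
    from urllib.request import urlopen, Request

    class LinkCollector(HTMLParser):
        """Collect the absolute hyperlinks found on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            href = dict(attrs).get("href") or ""
            if tag == "a" and href.startswith("http"):
                self.links.append(href)

    def check_links(page_url):
        parser = LinkCollector()
        parser.feed(urlopen(page_url).read().decode("utf-8", "replace"))
        for link in parser.links:
            try:
                # HEAD avoids downloading each target page in full.
                urlopen(Request(link, method="HEAD"), timeout=10)
            except OSError as err:  # covers URLError, HTTPError, timeouts
                print(f"BROKEN: {link} ({err})")

    check_links("https://www.example.edu/dept/")  # hypothetical URL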

Adding a search capability within your Web site is a very convenient feature for users. While we would like users to have access to information without searching, Table 4 illustrates how much material they may have to wade through. We found that search logs can also give rise to important suggestions for Web-site design and maintenance. A variety of solutions exists, from adding popular search engines' plug-ins to your own site to using an externally hosted search engine.
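To illustrate what an in-site search does, here is a toy inverted index in Python over already-fetched page text, with conjunctive matching of query words. The pages and their text are invented; a real deployment would use one of the plug-in or hosted solutions just mentioned, which also handle crawling, stemming, and ranking.

    from collections import defaultdict

    pages = {  # hypothetical page texts, already fetched
        "/contact.html": "faculty and staff phone numbers and email addresses",
        "/courses.html": "course schedules and syllabi for the fall semester",
    }

    # Build an inverted index: word -> set of pages containing it.
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)

    def search(query):
        """Return the pages that contain every word of the query."""
        words = query.lower().split()
        if not words:
            return []
        return sorted(set.intersection(*(index[w] for w in words)))

    print(search("phone email"))  # -> ['/contact.html']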

Summary

We created an initial task analysis of the audience for university department Web sites and the types of information users seek there. The list was developed through several analyses. While the tasks are not surprising on their own, their breadth suggests a wider and deeper use than we thought we would see when we started out to create this list.

The list of tasks provides useful suggestions for improving department Web sites. The results of checking existing department and school Web sites against our list suggested where the sites we examined could be improved, as well as where the task analysis could be extended.

This list provides an example design document for other types of sites. Its size suggests that similar Web sites will require a similarly sized list. The list is not so long that it could not be created again in a week for a new type of site, or even in a few hours for a similar site. It is clear to us, however, that a list of this size cannot simply be remembered, or created without some external memory aids.

The existing task analysis is unlikely to be complete. As we examined additional Web sites, we found more tasks that some sites supported. Testing further Web sites is likely to extend the analysis with additional tasks and types of users.

Some departments will want to emphasize specific features about themselves. Others, perhaps those whose academic discipline is related to Web sites, will emphasize some aspect of the site itself, perhaps design, perhaps usability, or perhaps demonstrations of their work. Additionally, different assumptions about the users such as bandwidth (do they have dial-up access or broadband access?) and the usage of the site (will the pages often be printed?), will give rise to differences in design. This is to be expected.

There is a cost to testing a Web site using this list: it takes about an hour to work through the list of tasks and search the Web site to determine whether the information is available. In the future, we can imagine that this task, like many in usability testing, could be automated (e.g., see [5] for a general review). Cognitive models could be used as surrogate users to automatically check entire Web sites against our task analysis. We are working on this [11, 12].

Perhaps the largest lesson that we continue to relearn is that the online world parallels the real world. The task analysis, taken as a whole, suggests that nearly all the constituencies of a university department now interact with its Web site, and that nearly all the tasks and work that departments do are mirrored on the Web site as well. This task analysis can thus be informed by the physical, administrative, and even social structures of a department, and can in turn help support them in their tasks. As more departments provide more information, users will increasingly be able to count on the Web for it.

Acknowledgements

This research was sponsored in part by the PSU Minority Undergraduate Research Experience, and in part by the School of IST’s Solution Institute. User logs were provided by the Web master for the School of IST’s homepage, Rose Pruyne, and by Penn State’s Web master. Kate Simpson and an audience at the U. of Melbourne’s School of Information Science’s department seminar series provided helpful comments.

References

1. Beevis, D. (Ed.). (1999). Analysis techniques for human-machine systems design: A report produced under the auspices of NATO Defense Research Group Panel 8. Wright-Patterson Air Force Base, OH: Crew Systems Ergonomics/ Human Systems Technology Information Analysis Center.

2. Block, M. (Ed.). (2002). Doing it right: How some universities encourage the creation of prime research Web sites. Searcher, 10(8), 1-8. Retrieved 18 November 2002, from infotoday.com/searcher/sep02/block.htm

3. Brinck, T., Gergle, D., & Wood, S. D. (2002). Usability for the Web. San Francisco: Morgan Kaufmann.

4. Cheyne, T., & Ritter, F. E. (2001). Targeting respondents on the Internet successfully and responsibly. Communications of the ACM, 44(4), 94-98.

5. Ivory, M. Y., & Hearst, M. A. (2001). The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33(4), 470-516.

6. John, B. E., & Kieras, D. E. (1996). Using GOMS for user interface design and evaluation: Which technique? ACM Transactions on Computer-Human Interaction, 3(4), 287-319.

7. Nielsen, J. (1997). Top Ten Mistakes of Web Management, Alertbox. www.useit.com/alertbox/9706b.html

8. Pennock, D., Flake, G. W., Lawrence, S., Glover, E. J., & Giles, C. L. (2002). Winners don't take all: Characterizing the competition for links on the Web. Proceedings of the National Academy of Sciences, 99(8), 5207-5211.

9. Raskin, J. (2000). The humane interface. Reading, MA: Addison-Wesley.

10. Ritter, F. E., Freed, A. R., & Haskett, O. L. (2002). Discovering user information needs: The case of university department Web sites (Tech. Report No. 2002-3). Applied Cognitive Science Lab, School of Information Sciences and Technology, Penn State. acs.ist.psu.edu/acs-lab/reports/ritterFH02.pdf.

11. Ritter, F. E., & Young, R. M. (2001). Embodied models as simulated users: Introduction to this special issue on using cognitive models to improve interface design. International Journal of Human-Computer Studies, 55, 1-14.

12. St. Amant, R., Horton, T. E., & Ritter, F. E. (2004). Model-based evaluation of cell phone menu interaction. In Proceedings of the CHI '04 Conference on Human Factors in Computing Systems (pp. 343-350). New York: ACM.

13. Schraagen, J. M., Chipman, S. F., & Shalin, V. L. (Eds.) (2000). Cognitive task analysis. Mahwah, NJ: Erlbaum.

Authors

Frank E. Ritter
The Pennsylvania State University
frank.ritter@psu.edu

Andrew R. Freed
The Pennsylvania State University
arfreed@nc.rr.com

Onida L. M. Haskett
The Pennsylvania State University
olh102@yahoo.com

About the Authors:

Dr. Frank E. Ritter helped start the School of Information Sciences and Technology at Penn State. He is also affiliated with the Psychology and Computer Science Departments. He is interested in the cognitive modeling of users as a way to test interfaces. At one point he was the Web foreman at the University of Nottingham’s Department of Psychology. He has degrees from UIUC and from CMU. He is working on a textbook, the ABCs of HCI, with Churchill and Gilmore.

Andrew R. Freed received his BS and MS degrees in Computer Science and Engineering from Penn State. His thesis was on evaluating telephone interfaces using a cognitive model. He presently works at IBM and can be reached at arfreed@nc.rr.com.

Onida L. M. Haskett received her BSEE degree from Penn State. She is currently teaching mathematics in a junior high school in New York City. In addition to teaching, Onida is attending graduate school part-time, pursuing an MEd in Secondary Education-Mathematics. She can be reached at misshaskett@hotmail.com.

Tables

Table 1. An unordered and nonexclusive list of types of users of university department Web sites.

Table 2. Registrar-related tasks, grouped by keyword including synonyms.

Table 3. Items for department Web sites from the PSU logs.

Table 4. Comparison of department Web sites. (w) indicates tasks added from the initial Web-site examination, (h) from hard-copy materials, (s) from search queries, (o) from open interviews, * from our comparison, and ** based on post-analysis feedback. n/a indicates not applicable; a bullet indicates a supported task.

Table 5. Do-it-yourself Web-site submission to search engines.

Sidebar: Penn State

Data from Penn State's School of Information Sciences and Technology Web site indicates that the domain ist.psu.edu registered over 640,000 page views from over 116,000 unique visitors in 2001 alone (based on the WebTrends 2001 report for ist.psu.edu), and this has only increased in the years following. Nielsen/NetRatings reports that there are an estimated 450 million users of the Internet, and that at any given time approximately 250 million of them are actively using it (www.nielsen-netratings.com).

With so many people using the Internet, it is important for university departments to provide the right information to their users, who are increasingly online.


 
