At the CHI 2001 conference, I was on a panel with Rolf Molich, Brenda Laurel, Whitney Quesenbery, and Carolyn Snyder that discussed ethical dilemmas in HCI. I think that HCI practitioners face ethical issues all the time, but ethics is a bit like sex or religion: people are uncomfortable talking about it in public forums. My past may have influenced my perspective. I have chaired a military human use review board, worked on a draft of the UPA Code of Conduct, been involved in a court case that focused on the ethics of experimentation, and conducted research on victims of crime that had significant ethical implications [3, 6]. I believe that HCI practitioners face ethical issues nearly every day in planning, conducting, analyzing, and interpreting design and evaluation activities. This column addresses ethical dilemmas that HCI practitioners are likely to face and discusses how to resolve them.
Video Archives and Organizational Access to Videos Create Ethical Problems. The rapid evolution of digital video technologies makes it easy to create a video archive of usability testing, participatory design sessions, and other HCI activities. Streaming video allows colleagues to view usability testing individually rather than in a group. These video technologies allow anyone on a product team (or the entire organization) to watch usability testing, which could foster more involvement by product teams and other stakeholders. Colleagues don't have to go to the lab to watch sessions; they can watch from the privacy of their own offices. This remote-viewing scenario poses several ethical problems [2, 4]. If colleagues are watching video from their offices, you lose some control over exactly who is watching and whether viewers understand that they must not discuss the session. Most consent forms have confidentiality statements and also indicate who (often the product team) has access to the data from a usability test, so with streaming video and video archives, it would be easy to violate the informed-consent agreement with participants. To use these video technologies ethically, you could do several things:
- Require anyone who does remote observation of usability testing or other activities to attend a briefing on ethical procedures for remote viewing before they gain access to any streaming or archived video.
- Limit access to the video to colleagues who are identified in the consent form signed by the participant.
- Before the video of a session is displayed, present the viewers with the rules for discussing what they view and require them to "sign" an electronic agreement stating that they will not talk about what they see outside of the product group.
- Do not include pictures of your participants in the video; just show the screen or product they are using without the face view.
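The safeguards above amount to a gate in front of the video archive: a colleague sees a session only if the participant's consent form names them and they have completed the briefing and signed the electronic agreement. A minimal sketch of such a check follows; the function name and record fields are hypothetical illustrations, not part of any real streaming product:

```python
# Hypothetical sketch: gate access to an archived session video based on
# who was named in the participant's signed consent form and whether the
# viewer has attended the ethics briefing and signed the e-agreement.

def may_view_session(viewer_id, session):
    """Return True only if the viewer satisfies all three conditions."""
    consented_viewers = session["consent_form"]["authorized_viewers"]
    return (
        viewer_id in consented_viewers          # named on the consent form
        and viewer_id in session["briefed"]     # attended the ethics briefing
        and viewer_id in session["signed_nda"]  # signed the e-agreement
    )

session = {
    "consent_form": {"authorized_viewers": {"alice", "bob"}},
    "briefed": {"alice", "bob", "carol"},
    "signed_nda": {"alice"},
}

print(may_view_session("alice", session))  # True: meets all three conditions
print(may_view_session("bob", session))    # False: has not signed the agreement
print(may_view_session("carol", session))  # False: not named on the consent form
```

The key design point is that the authorized-viewer list originates from the signed consent form itself, not from general organizational permissions.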
Protect Your Internal Research Participants at Least as Much as Your External Participants. Internal users, your colleagues in the office, risk their reputations if their managers or peers observe them performing poorly in a usability test or other HCI activity, especially if the test or activity involves key job-performance areas. Internal participants often do not receive the same briefing, debriefing, and consent form that external participants are required to review and sign. The rationale for skipping some of the consent procedures that we routinely use with external participants is that employee agreements have clauses that give the organization the right to gather and record information from employees. The fact that something is legal does not always make it ethical. Given the potential risk to reputations and credibility, internal participants should have at least the same, and possibly more, protection than external participants. To protect internal participants, I would recommend the following:
- Managers should not be allowed to observe their employees in a usability test or other activity where those employees are in the role of research participant.
- Data from internal research participants should not be stored with any identifying information.
- Internal research participants should be given the same briefing, consent form, and debriefing that external participants would receive. Though it may seem like an odd thought, perhaps the consent form should mention "negative reactions of colleagues to your performance" as an actual risk.
- Data from research involving new employees who have not established a track record should be protected as if it were a national secret. While new employees are valuable because they can provide the perspective of a novice, their credibility is tenuous and could be hurt if their performance is perceived as subpar.
Pilot Test the Consent Form, Briefing, and Debriefing. Participants must understand what is in the informed-consent form as well as the content of any verbal briefing regarding the study. The text and any explanations provided by the facilitator must be comprehensible and clear. The use of legalese and technical jargon not generally understood by your sample of participants would be an ethical breach and may lead to legal repercussions if anything goes badly in your study. You might be tempted to apply readability formulas to your consent form, but readability formulas focus on word and sentence length and are poor indicators of understandability. They do nothing to reveal missing information, incorrect information, or organizational or cultural faux pas.
Ancker tried using the Flesch-Kincaid readability formula to develop a consent form for patients with traumatic brain injury, but found: "The scale cannot predict readers' reactions or account for the quality or accuracy of the information, the document's visual design, or its cultural and rhetorical appropriateness. The scale is sensitive to word length, but it cannot identify short words that might be unfamiliar. The scale is even more sensitive to sentence length, so that dividing a single sentence can change the grade of an entire document." Her conclusion was that pilot testing the consent form was the only way to ensure that it was comprehensible for the target audience. I agree with this conclusion and encourage you to pilot test your consent form, briefings, and debriefings in the same way that you pilot test procedures and tasks.
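Her point about sentence length is easy to demonstrate numerically. The standard Flesch-Kincaid grade-level formula is 0.39 x (words per sentence) + 11.8 x (syllables per word) - 15.59. The sketch below uses a deliberately crude vowel-group syllable count, so the absolute grades are rough, but it shows how merely dividing one consent-form sentence in two drops the computed grade by almost five levels:

```python
import re

def count_syllables(word):
    # Crude estimate: one syllable per run of vowels (enough to illustrate).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

one = ("You agree that video recordings of this session may be shared with "
       "the product team and you understand that you may withdraw at any time.")
two = ("You agree that video recordings of this session may be shared with "
       "the product team. You understand that you may withdraw at any time.")

print(round(fk_grade(one), 1))  # 11.6 -- one long sentence scores as ~12th grade
print(round(fk_grade(two), 1))  # 6.8  -- split in two, it scores as ~7th grade
```

The drop comes entirely from the words-per-sentence term: the formula's score improves dramatically while the vocabulary, content, and actual clarity stay essentially the same, which is exactly why a formula score is no substitute for pilot testing with real participants.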
Can a Person Really Give Informed Consent for the Use of Recorded Data Before They Perform an HCI Activity? When people sign a consent form and agree that their data, often video of the usability test or interview session, can be used for specific purposes (this is sometimes called a waiver), they don't know what they will do or say during the session. Can a participant really give informed consent about future activities? Legally, yes. Ethically, well, it gets a bit sticky. In the moment, people may forget that they are being taped and do or reveal things that they wouldn't really want to make public. I've taped sessions where people admitted to copyright violations, health problems, and incompetent bosses. None of these people asked me if they could review their videotape or rescinded their consent for use of the recordings. There may be times when it would be more ethical to ask explicitly at the end of a session if the participants would like to review their recordings and let the facilitator know if they are still willing to allow the organization to use the recorded data for its stated purposes. Obviously, that would make logistics more difficult, but philosophically, the participants would be in a more informed position to give permission for the use of recorded data.
Create a Simplified Nondisclosure Agreement for Short HCI Activities. I have been invited to test products or give my opinion in interviews. Before I test the products or give my opinion to the interviewer, I'm asked to sign a two- to three-page organizational nondisclosure agreement that requires a senior lawyer to decipher. These NDAs are sometimes appended to the informed-consent form. Complex NDAs are appropriate when customers receive products for extended use and you want to prevent loss of your intellectual property, but most short HCI activities don't provide the kind of product access that warrants a complex NDA. Consider working with your legal department to create a short version like the one below:
"Since you are working with an unreleased product, you agree not to discuss the product's features or your experience with the product in this study for a period of [insert time period]."
We face ethical issues every day in the design and execution of HCI activities. I've highlighted some ethical issues that I've experienced and recommend that colleagues take some time to reflect on the ethical context of their work with internal and external stakeholders. We need to examine and refine our ethical frameworks continually and develop explicit best practices to ensure that we retain credibility and trust.
2. Burmeister, O. K. "Usability Testing: Revisiting Informed Consent Procedures for Testing Internet Sites." In Selected Papers from the Second Australian Institute Conference on Computer Ethics vol. 7. Edited by J. Weckert. ACM International Conference Proceeding Series. Darlinghurst, Australia: Australian Computer Society, 2000, 3-9.
4. Mackay, W. E. "Ethics, lies and videotape...." In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Denver, Colorado, May 7-11, 1995. Edited by I. R. Katz, R. Mack, L. Marks, M. B. Rosson, and J. Nielsen. New York: ACM Press/Addison-Wesley Publishing Co., 1995, 138-145.
5. Molich, R., B. Laurel, C. Snyder, W. Quesenbery, and C. E. Wilson. "Ethics in HCI." In CHI '01 Extended Abstracts on Human Factors in Computing Systems, Seattle, Washington, March 31 - April 5, 2001. New York: ACM Press, 2001, 217-218.
6. Wilson, C. E., and M. S. Greenberg. "Victims' Reactions to Crimes: A Laboratory Experimental Approach." Paper presented at the annual meeting of the American Psychological Association, Washington, DC, 1976.
Chauncey E. Wilson
ABOUT THE AUTHOR
Chauncey Wilson is a usability manager at The MathWorks, instructor in the Human Factors and Information Design Program at Bentley College in Boston, and author of the forthcoming Handbook of Formal and Informal User-Centered Design Methods (Elsevier). Chauncey was the first full-time director of the Bentley College Design and Usability Testing Center and has spent more than 25 years as a usability practitioner, development manager, and mentor. In his limited spare time, Chauncey hones his culinary skills as an amateur (but very serious) chef.
©2007 ACM 1072-5220/07/0700 $5.00