In the 1983 movie WarGames, teen hacker David Lightman (played by Matthew Broderick) accessed NORAD headquarters through dial-up, guessed the password to the W.O.P.R. supercomputer, and nearly destroyed the world in a game of Global Thermonuclear War.
Armageddon in three steps? Now that’s ease of use!
Not very secure, though.
Between Windows XP SP2, the pre-installed Symantec security bundle, and a Cisco VPN client, I have three firewalls on my home computer and I don’t know if that is a good thing or a bad thing.
One of my firewalls likes to ask me if I want to allow .exes I’ve never heard of to access the Internet. I paid pretty close attention to the first 100 warnings, but now I mostly answer "yes," especially if I am in a hurry.
Sometimes it seems that any program running will interrupt what I am doing to ask me for a decision whose security implications I don’t understand. Systems designers with the best intentions strive to provide security checks and prompts, and to hide complexity, but the combined effect on the daily computer experience is a cavalcade of pop-up warnings about email attachments, Web content, and software updates du jour.
Even when I deliberately try to implement a security measure, the process is often clunky. In the simplest case, restricting access to a public file share in Windows XP requires about eight separate steps and five distinct interfaces.
And then there are passwords, the mascot for poor user-centered security. I have guessed the PIN to my wife’s bank card, and a friend of mine keeps his work-related passwords on a Post-It note stuck to his laptop. He is an investment banker.
To sum it up, the current state of the user experience around computer security can be characterized as cumbersome, dauntingly complex, and prone to spontaneous interruption.
A key challenge faced by designers of security features and applications is ensuring that users will use them and use them appropriately. On the surface, usability and security seem so contradictory that combining them could spark a matter/antimatter reaction that vaporizes the product team. After all, one advocates ease of use, the other restriction of use, right? The problem is not a religious conflict between two ideologies, but devilish implementation details. Security mechanisms are often implemented in a way that conflicts with user goals and ease-of-use principles.
This is not a new problem. It has evolved along with computing technology, albeit a step behind. In the early days, it was reasonable just to secure physical access to your machine and the computers that composed the ARPANET (forerunner to the Internet) were controlled by a community of users who generally knew and trusted each other. Given this kind of environment, the early networking protocols, still in use today, were designed for openness and flexibility.
However, as more and more users began sharing time on single machines and as more machines were connected to networks, the need for security controls grew. To meet this need, security mechanisms were retrofitted into systems never designed for security to begin with, adding layers of complexity and security holes for decades to come. As new technologies emerge and computing devices and services become more interconnected, the landscape of security threats is ever changing.
The concept of computer security has also evolved over time. In the classic computer-science sense, security technically referred to mechanisms for proving to a system who you are (authentication), determining what you are allowed to do (authorization), and tracking who did what (auditing). The definition also included mechanisms for ensuring data privacy (confidentiality), guaranteeing data has not been changed (integrity), and ensuring data or services are available to users (availability). Social engineering and tactics like phishing were traditionally considered practical security concerns but not technical ones.
Users have a different set of concepts about what computer security is. Talk with home users about security and you learn that to them, computer security covers any bad thing that might happen on, to, with, or around their computer. In the users’ definition of the term, computer security includes spam, spyware, pop-up ads, and even parental controls. In response, major software vendors now offer centralized one-stop solutions for computer protection to assuage your fears of any kind of digital malevolence.
In the past, many security features and applications lacked iterative design and usability support because they were considered advanced tools. This is not the case anymore. The expected skill level of the target audience for security features and applications is dropping and the scope of security is expanding. As computer technology continues to fuse with mainstream pop culture, public awareness of security issues will grow. The first truly widespread virus attack on cell phones will grab international attention.
What will the future of the user-security experience be? Imagine yourself a few years from now, struggling to change the firewall settings on your iPod phone because you need to download a patch for the Maria Sharapova virus, using the click wheel. Now that is a scary thought.
Security is as ubiquitous as it is nebulous. Given the expanding definition of security and intricacy of details, how should we treat user-centered security? Should designers find ways to make the guts of security more transparent so users understand it? Are user-security interactions unnecessary? That is, is security fundamentally a system-level issue that we should hide from users at all times? Which provides a better experience?
This special issue on HCI and security presents a range of user-security issues and challenges. From the design of usable security mechanisms and evaluation methods to insights into user and environmental factors, these articles illustrate the breadth of what the HCI community can offer to improve computer security and, in the end, make the world a better place.
1. A recent survey from AOL and the National Cyber Security Alliance reported that roughly 72 percent of home users in the US do not have a properly configured firewall. America Online and the National Cyber Security Alliance (2004). AOL/NCSA Online Safety Study. http://www.staysafeonline.info/news/safety_study_v04.pdf
2. Laptop theft was the third most common corporate security incident reported last year by an annual FBI cybercrime survey. CSI/FBI 2005 Computer Crime and Security Survey (2005). Computer Security Institute. http://www.gocsi.com/press/20050714.jhtml
About the Author:
Ryan West is a user-experience researcher who has studied enterprise-class systems administration at Microsoft and now SAS Institute. He has conducted academic research in risk and decision-making and applied research in areas ranging from medical mistakes to computer security. Ryan has a PhD in cognitive psychology from the University of Florida.
©2006 ACM 1072-5220/06/0500 $5.00