XX.3 May + June 2013
Page: 62

A guerrilla usability lab with free software

José Collado, Paul Mora, Elizabeth Parham

Usability, or the quality of a user’s experience when interacting with a system, is a prerequisite for new application delivery within our organization. Nevertheless, a significant percentage of the applications being developed explicitly ignore current best practices. Their authors cite lack of time, lack of qualified personnel, or lack of an adequate space to carry out usability tests. In effect, the notion that usability testing is expensive, or something that needs substantial resources, often precludes the adoption of usability principles throughout an application’s lifecycle.

Here, we describe a portable, low-cost usability lab targeted at in-house developers, stakeholders, and other interested parties. With it, we aim to empower and encourage development teams across organizations to perform small-scale, routine usability testing.

Although usability labs are neither the only nor the latest approach to user experience, their adoption provides valuable metrics of effectiveness, efficiency, and satisfaction (following the ISO-9241 dimensions of usability) that inform decisions about interface and interaction design. Our aim is to bring greater quality and consistency to the interface layer across in-house applications and to avoid a degraded user experience.

The primary goal of the usability department is to promote the understanding and application of usability best practices within the organization. To bring usability into view, the department has established a hotline, providing free user-experience consultation to in-house development teams. It has also focused on distributing relevant usability-related material to each of the groups involved in the development process, and has produced resources for the different stakeholders to showcase internal practices, such as the introductory usability guidelines (for all users), use cases (for team leaders), code templates (for developers), and more.

Efforts are aimed at long-term, deep effects, but we realize that changing habits in the workplace is a difficult proposition. We see this usability-lab rollout as an opportunity to engage directly with developers and other stakeholders.

Usability Labs

A usability lab is, in essence, an observation platform for learning from the interaction between a participant and an application. Within our organization, labs are normally used to spot problems in prototype interfaces.

A modern implementation of a usability lab should be able to produce metrics, both qualitative and quantitative, conforming to the ISO-9241 standard, which characterizes usability in terms of effectiveness, efficiency, and satisfaction. These metrics, produced after each testing session, enable assessment of the changes made to an application before rollout, and allow for the tracking of usability throughout an application’s lifecycle.
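
To make these dimensions concrete, the sketch below shows one way effectiveness and efficiency might be computed from per-task session records; satisfaction is typically captured separately with a questionnaire such as the SUS described later in this article. The record format, task names, and baseline times are illustrative assumptions, not part of the standard.

    from dataclasses import dataclass

    @dataclass
    class TaskResult:
        task_id: str
        completed: bool        # did the participant finish the task?
        seconds: float         # participant's time on task
        target_seconds: float  # expert/baseline time for the same task

    def effectiveness(results):
        """Completion rate: share of tasks finished successfully."""
        return sum(r.completed for r in results) / len(results)

    def efficiency(results):
        """Time-based efficiency: mean ratio of baseline time to actual
        time, counting failed tasks as zero."""
        ratios = [(r.target_seconds / r.seconds) if r.completed else 0.0
                  for r in results]
        return sum(ratios) / len(ratios)

    session = [TaskResult("find-form", True, 95.0, 60.0),
               TaskResult("submit-claim", False, 240.0, 120.0),
               TaskResult("print-receipt", True, 40.0, 35.0)]
    print("effectiveness: {:.0%}".format(effectiveness(session)))
    print("efficiency:    {:.2f}".format(efficiency(session)))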

Labs can also reproduce live or recorded video of sessions with participants. Often, when developers watch real people using their applications, usually struggling with them in some way, they internalize the importance of usability, take an interest in its best practices, and improve the user experience of their work.

Traditional usability labs. Traditional usability labs required a team of interdisciplinary researchers and relied on purpose-built Gesell rooms and what was then sophisticated recording equipment, so logistically and economically they were expensive investments. This is why the term usability lab used to evoke, as it still does, the idea of a dedicated physical place and instantly available resources.

The typical dedicated lab consists of two adjacent rooms separated by a wall with a one-way mirror (a Gesell chamber). One room hosts the observers, while the other accommodates a participant and the facilitator, who provides guidance and indications throughout the test.

The participant and the facilitator sit in front of a screen that presents the interface under evaluation, usually in an office environment, although some labs reproduce a more domestic setting for the comfort of the participant.

One or more video cameras are laid out to focus on the participant’s screen, where the tested interface is recorded along with mouse movements, keyboard presses, facial expressions, and body language. Participants are required to talk as they solve tasks—the “think aloud” method—and this verbal expression of their thought process is also recorded.

Behind the mirror, alongside the observers or in a separate room, a technician controls the camera(s), the sound mix, and other details of the recording process, while the observers watch the participant and comment or take notes about the interactions.

Guerrilla usability labs. In contrast to resource-intensive traditional usability labs, there is a growing practice of frequent, relatively informal usability observation using few resources and light infrastructure. This style of observation is possible thanks to off-the-shelf commodity hardware and the emergence of robust software solutions.

These guerrilla labs do not normally have a dedicated place for their tests. Instead, they take advantage of common spaces within the organization to set up a minimalist system that mimics the functionality of traditional labs. Meeting rooms, offices, classrooms: almost any space can be adapted to host usability-testing sessions. That is why the platform needs to be portable and as self-contained as possible.

Figure 1 illustrates a typical layout of our guerrilla usability lab. The workstations are arranged in an L shape, accommodating the participant and the facilitator in close proximity, so guidance and indications can take place naturally and without effort. At the same time, the arrangement does not allow the facilitator to see the participant’s screen, so task resolution can be carried out with a sense of privacy and respect for personal space that would not happen with a facilitator staring over the user’s shoulder.

The functionality that these guerrilla labs need to cover is the same as that of traditional labs. Table 1 summarizes each of these features, along with their objectives within the bigger picture.

Implementation of a Low-Cost Usability Lab

Our experience shows that setting up an observation platform with enough quality for usability testing can be done with relative ease, without licensing fees or special platform requirements.

Hardware. The hardware must balance cost and portability and should be flexible enough to cover as many scenarios as possible. We do not provide explicit specifications, only broad requirements:

  • Audio. Audio is normally recorded along with the video feed; an external, good-quality microphone is advised. If the observation room hosts more than two or three observers, it is also practical to acquire external speakers so everyone can hear comfortably.
  • Video. The video feed should carry a useful representation of the participant’s facial expressions and gestures. Although many laptops currently integrate a webcam with sufficient resolution for the job, it is often better to use an external USB webcam, because it can be plugged in and focused at different angles—helpful, considering that users can be tall or short. If the observation room hosts more than two or three people, a large screen or a projector is especially convenient.
  • Processor. Two midrange laptops supply the computing power required by the lab in a compact, portable format. Although the load is asymmetric (the moderator’s machine does the recording and video compression), it is useful to have two computers of similar capacity so that one can replace the other in case of breakage or malfunction.
  • Connectivity. The existence of a local area network to which the laptops can connect and exchange data and services is assumed. Typical network equipment in our context runs at 1Gbps, but multiple variables, beyond the scope of this article, significantly affect the available bandwidth. Though most of the time the network conditions required for a remote desktop will be met (low latency, more than 1.5Mbps of bandwidth), a few common tools are handy for troubleshooting connectivity problems; a small pre-session check is sketched after this list.
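
As one example of such a check, the sketch below pings the participant’s machine before a session and flags latency that would make the remote-desktop feed sluggish. The host name and the 50ms threshold are illustrative assumptions.

    import platform
    import subprocess

    def ping_ok(host, max_ms=50.0, count=4):
        """Return True if the average round-trip time stays under max_ms."""
        flag = "-n" if platform.system() == "Windows" else "-c"
        result = subprocess.run(["ping", flag, str(count), host],
                                capture_output=True, text=True)
        if result.returncode != 0:
            return False  # host unreachable
        # Crude parse: collect every 'time=XX' figure from the output.
        times = [float(token.split("=")[1].rstrip("ms"))
                 for token in result.stdout.split()
                 if token.startswith("time=")]
        return bool(times) and sum(times) / len(times) <= max_ms

    if __name__ == "__main__":
        host = "participant-laptop.local"  # hypothetical host name
        print("link OK" if ping_ok(host)
              else "check the network before testing")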

Software. Software for the lab needs to conform to the organization’s software requirements (Windows XP SP3 and Internet Explorer 8, in our case), which restricts the choice of solutions. But the system may be easily adapted to other environments.

Whenever there is a functionally valid option, we choose free software and open source alternatives over proprietary solutions due to the advantages of the development model, the vast catalog of available solutions, and the type of licenses used, which allow for the work to be legally redistributed. Table 2 summarizes the packages we have chosen, along with the role they play as part of a usability lab.

Remote-desktop software allows the facilitator and the observers to watch a participant’s session in real time. In essence, the participant’s computer runs a VNC server, which exports the display so that different viewers can attach to the session. Besides the screen image, mouse movements, and keyboard entries sent through VNC, a webcam is strategically placed to show the user’s facial expressions. Although the camera sits at the participant’s workstation, it is actually connected to the moderator’s computer, which integrates both transmissions: the one showing the user’s gestures and the one showing the interaction with the application (Figure 2). The usual way to watch these two feeds is to overlap them by manually tiling the windows in a picture-in-picture (PiP) arrangement that allows parallel observation of both the interaction and the reactions it evokes in the participant (Figure 3). The resulting window composition is recorded with an external screen-capture package.
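
On the moderator’s machine, attaching the two feeds can be as simple as launching the two viewers side by side. The sketch below assumes a TigerVNC-style vncviewer and ffplay for the webcam; the actual packages (see Table 2), command-line flags, device name, and address are placeholders that vary by tool.

    import subprocess

    PARTICIPANT = "192.168.1.42:0"  # hypothetical VNC display address

    # Participant's screen, attached read-only so the test is not disturbed.
    # (-ViewOnly is TigerVNC's spelling; other clients differ.)
    vnc = subprocess.Popen(["vncviewer", "-ViewOnly", PARTICIPANT])

    # Webcam feed in a small window, to be tiled over the VNC window
    # in a picture-in-picture arrangement before recording starts.
    # "video=USB Camera" is a placeholder DirectShow device name.
    cam = subprocess.Popen(["ffplay", "-f", "dshow",
                            "-i", "video=USB Camera",
                            "-x", "320", "-y", "240"])

    vnc.wait()       # keep both feeds up until the VNC viewer is closed
    cam.terminate()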

Capturing usability tests on video allows for subsequent analysis of the human-computer interaction. The analysis can also produce highlights that illustrate weak or strong points. Beyond the capture software, it is important that the system has appropriate codecs installed, so that a reasonable balance can be struck between hard-drive space and video quality. In our context, usability-testing sessions take about an hour of continuous recording. After testing numerous codecs, we settled on Xvid, the open source counterpart to DivX, and have seen dramatic reductions in file size (approximately 500MB per hour) with more than acceptable quality. Video compression is resource intensive, though, so it is very important that the computers are adequately powered for this task.
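
As a rough sanity check, 500MB per hour works out to about 500 × 8 / 3600 ≈ 1.1Mbit/s of video. The sketch below shows one way the capture step could be driven, assuming ffmpeg with the libxvid encoder is available; the article’s actual capture package is the one listed in Table 2.

    import subprocess

    subprocess.run([
        "ffmpeg",
        "-f", "gdigrab",      # Windows desktop capture (x11grab on Linux)
        "-framerate", "15",   # 15 fps is plenty for UI interaction
        "-i", "desktop",
        "-c:v", "libxvid",    # Xvid encoder
        "-b:v", "1100k",      # ~500MB/hour target bitrate
        "-t", "01:00:00",     # stop after one hour
        "session.avi",
    ])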

Once the video recording is finished, the raw footage is analyzed and edited to extract highlights that can illustrate relevant features in the interface, or the conclusions of the evaluation. These extracts will also provide developers with valuable feedback for future reference.

Non-linear video editors are complex tools that provide advanced multimedia features but are generally too demanding in hardware terms. Avidemux occupies the middle ground between the required functionality and a complete digital video editor: it is a powerful tool, ready for the scenario just described, with affordable hardware requirements.
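
Avidemux handles this interactively; once highlight timestamps have been noted, a batch cut can also be scripted. The sketch below uses an ffmpeg stream copy as an alternative tool (an assumption, not the article’s workflow); the timestamps and labels are illustrative.

    import subprocess

    highlights = [                 # (start, duration in seconds, label)
        ("00:12:30", "90", "menu-confusion"),
        ("00:41:05", "65", "form-validation-error"),
    ]

    for start, duration, label in highlights:
        subprocess.run([
            "ffmpeg", "-ss", start, "-t", duration,
            "-i", "session.avi",
            "-c", "copy",          # cut without re-encoding (keyframe-aligned)
            "highlight-{}.avi".format(label),
        ])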

In the same way, the raw audio track can be analyzed to extract highlights that illustrate the conclusions of the evaluation, especially when no video is available (e.g., due to webcam failure or in remote testing sessions).

Upon finishing the tasks, the moderator gives the participant a brief survey about the user experience of the application. Although there is a long list of possible questionnaires, given our requirements we use an adapted version of the System Usability Scale (SUS), a quick 10-item survey that evaluates different facets of perceived usability.
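
For reference, the standard (unadapted) SUS scoring works as sketched below: odd-numbered items score the answer minus 1, even-numbered items score 5 minus the answer, and the sum is scaled by 2.5 onto a 0-100 range.

    def sus_score(answers):
        """Score a standard 10-item SUS questionnaire (answers are 1-5)."""
        if len(answers) != 10 or not all(1 <= a <= 5 for a in answers):
            raise ValueError("SUS needs ten answers in the range 1-5")
        total = sum((a - 1) if i % 2 == 0 else (5 - a)
                    for i, a in enumerate(answers))  # i == 0 is item 1
        return total * 2.5

    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0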


With this prototype, we have shown that performing simple usability tests can be feasible, economical, and effective, without resorting to typical market leaders’ solutions.

Once mastered, the system is reliable and uncomplicated. Despite being less expensive and more informal, it still provides high-quality results.

When developers successfully integrate usability testing as part of their routine, they find and fix most problems early in the process, when it is still cheap to do so.

To successfully advocate usability practices, one has to walk the walk: show with real examples that usability principles are not a matter of opinion or beautiful design, but rather a set of comprehensive and interrelated practices that have been shown to work in real use cases.

If you want to get developers on board, don’t theorize; show them the code. Get familiar with their jargon and understand their tools and processes. That way you may find the appropriate moment to embed usability testing within their development cycle.


José Antonio Collado began his career in research in the fields of human factors and cognitive psychology in order to combine his two passions: technology and psychology. This led to work with telecom and IT departments and to teaching assignments. He currently leads the User Experience Department at Correos (the Spanish Post Office) and is an associate professor at the Universidad Complutense de Madrid.

Paul Salazar Mora was a computer geek from an early age. His current role is helping others enjoy and get the most out of computing by employing usability in interface design. With a varied career to date, his knowledge of and passion for computing, programming, usability, and marketing mean he can relate to and communicate with all departments and clients.

Elizabeth Parham is a philosophy graduate who, drawn by the then-emerging possibilities of the Internet, went on to earn an M.Sc. in e-Commerce from Coventry University and has since focused on the usability side of things. She currently works for Everis, managing user experience projects around Europe.


Figure 1. Typical layout of a guerrilla usability lab.

Figure 2. Schematic of webcam and remote desktop setup.

Figure 3. Screen capture of moderator’s desktop showing remote desktop session and participant’s webcam image.


Table 1. Feature summary of a guerrilla usability lab. (Note: Other functionalities such as eye tracking, mouse tracking, click count, heatmaps, etc. have been considered but were ultimately left out to reduce the lab’s complexity and cost.)

Table 2. Software for a low-cost usability lab.
