Dennis Wixon, Randy Pagulayan
In the September-October issue of interactions, August de los Reyes, creative director for the Windows Core Innovation Team, and I described an approach to designing emotionally engaging products. The approach is based on the James-Lange theory, a pioneering theory of emotion that places physical activity as the source of emotions rather than as a product of emotion. In this approach, emotions are a "readout" based on our activity and the context in which it occurs. This has clear implications for what user researchers focus on during the design of products, and speaks to the relationship between researchers and designers as they work together to create compelling products.
In the article mentioned above, we pointed out the parallels between this approach and the framework of mechanics, dynamics, and aesthetics developed by Marc LeBlanc (http://algorithmancy.8kindsoffun.com/), which is used extensively in game design. In that framework, designers control mechanics; the behavior of players is considered dynamics; and the conclusions that players reach about the game are aesthetics. Halo 3 is an example of the application of this framework and the James-Lange theory.
Halo 3 is the third game in the Halo series; it is a first-person shooter developed by Bungie Studios for the Microsoft Xbox 360. The game, released in late September 2007, holds the record for the highest grossing opening day in entertainment history, making $170 million in its first 24 hours. This achievement is even more striking when we consider that most videogames lose money.
During the design of Halo 3, we were able to collect and analyze large amounts of behavioral data and monitor the conclusions users reached about the game. This combination of behavior and conclusions was critical. Game designers are reaching for an aesthetic experience, an emotional conclusion about the game. But that aesthetic experience is based on how users play the game. By synchronizing both behavioral and aesthetic measures, we were able to provide the design team with key evidence of when their intent was not realized. Without both of these measures, designers would not have been able to make fully informed decisions regarding design changes.
Three key characteristics of the data reporting were:
1. Data reported very quickly. As many of you realize, timeliness of data is a key to effectiveness. Toward the end of any development project, hard decisions must be made to get the product out the door. No one can “wait for the data to come in.” So for us, every second counts. There is little time to spend doing a thorough analysis on thousands of data points. In Halo 3, we needed to be able to collect hundreds of hours of player time over a weekend and turn around our recommendations within a day or two.
2. Data reporting that speaks to designers. Although we are the experts in our field and in data analysis, we should stop assuming our partners can't handle looking at numbers. One of the key facets of our approach was presenting the player experience directly to designers using numbers and charts, and letting them do some exploration themselves. The behavioral data we collected was analyzed and plotted in terms that made sense to the designers, such as the location of player deaths during a mission. The reporting system supported easy, one-click drill-down to deeper levels. This required up-front investment: agreeing on the research questions designers wanted answered, and building views of the data that were simple to understand and that made problems easy to spot.
3. Data reporting that links quantitative and qualitative data. Straightforward linking between quantitative data (number of deaths) and qualitative data reporting (video) was another critical factor. Starting with quantitative data provided us and designers with the means to detect problems. Drilling deeper provided understanding. Drilling all the way down to video provided data in its full context and helped make the findings actionable. In other words, both the researchers and designers could see where problems were and how to fix them.
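The article does not describe the actual reporting tools, but the quantitative-to-qualitative drill-down in points 2 and 3 can be sketched in miniature. In this hypothetical example (the event format, area names, and timestamps are all invented for illustration), player-death events are first aggregated by mission area to surface hotspots, and each hotspot can then be traced back to the session-video timestamps a researcher or designer would jump to:

```python
from collections import Counter

# Hypothetical telemetry: each player death is logged with the mission
# area where it occurred and the session-video timestamp in seconds.
deaths = [
    {"area": "bridge", "video_ts": 312},
    {"area": "bridge", "video_ts": 845},
    {"area": "courtyard", "video_ts": 120},
    {"area": "bridge", "video_ts": 1502},
]

# Quantitative view: death counts per area, sorted so hotspots surface first.
counts = Counter(d["area"] for d in deaths)
hotspots = counts.most_common()

# Qualitative drill-down: for a chosen hotspot, collect the video
# timestamps so the finding can be reviewed in its full context.
def clips_for(area):
    return [d["video_ts"] for d in deaths if d["area"] == area]

print(hotspots)             # [('bridge', 3), ('courtyard', 1)]
print(clips_for("bridge"))  # [312, 845, 1502]
```

The design point the sketch illustrates is the one the essay makes: the counts detect the problem, and the linked video explains it.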
In addition to having a toolkit that was based on these principles, we developed a strategic approach to our research for Halo 3 based on our previous experience with Halo 2:
1. Combined formative and summative evaluation. In Halo 2, most of our user-testing efforts on the single-player campaign were formative. Every session was about drilling down into that week's version of the game, finding new problems, and fixing those found the previous week. In Halo 3 we extended this by adding a summative component to our measures. It was important to us to track how we were doing on several important high-level constructs from week to week and over the course of all of our testing. We were measuring the game experience as a whole, as opposed to focusing only on the drill-down details.
2. Address broader organizational needs. We're all familiar with the reasons why user experience professionals use real consumers as the major source of data. What we need to remember, though, is that many of our data-collection processes can also be applied to those working on the product. The Halo 3 team participated in internal playtesting at strategic points in the development cycle, which gave them a chance to experience the entire game in a "clean" way. This was critical, because it forced the team to step out of their normal roles and become "players" again, giving them the ability to see things they normally wouldn't have.
Although this work has also been described in some depth in Wired (http://www.wired.com/wired/issue/15-09/), in this essay we’ve presented the theory behind the work and the strategic context of the work.
Finally and most important, without the commitment and creativity of the Bungie team, the best research in the world would make no difference. Our duty as user researchers is to produce the clearest, timeliest, most holistic, and most actionable data possible. We've outlined how we did that for Halo 3. We wish every user researcher the opportunity to work with a team like Bungie at least once in their career.
Life is short, have fun.
Microsoft Game Studios
About the Authors
Dennis Wixon leads a team of more than 20 at Microsoft Game Studios, which provides consulting and research to make games fun. He is also a member of the User Experience Leadership Team, a corporate steering group. Dennis previously worked at Digital Equipment Corporation and has been an active member of CHI. He has authored many articles on methodology and co-edited Field Methods Casebook for Software Design (John Wiley & Sons).
Randy Pagulayan has led user research efforts on multiple games at Microsoft Game Studios, including Halo 2 (Xbox) and Halo 3 (Xbox 360). Randy also co-authored several book chapters on user-centered design in games and has been an invited speaker for the Nielsen Norman User Experience Event in 2003 and the Center for Computer Games Research at the IT University of Copenhagen in 2005. Most recently, he was featured in a cover story in Wired magazine (September 2007).
©2008 ACM 1072-5220/08/0100 $5.00