The democratization of design

XVI.5 September + October 2009
Page: 40

Ps and Qs: Research automation as technomethodological pixie dust


Author:
Elizabeth Churchill

Like many people in these times of cost saving, I have lost my sense of humor. Why? Because what constitutes reasonable economizing—determining what costs are or are not optional—reveals some curious assessments of value. I am referring to human-centered research and the peculiar assumption that qualitative research methods—like qualitative interviewing and ethnographically inspired fieldwork—are too expensive to use in the design process during lean times.

There are plenty of companies willing to jump in and offer cut-rate services, but, as in all of life, one should be aware that “you get what you pay for.” Timothy de Waal Malefyt’s recent article in American Anthropologist details how corporations are turning to “multiple ethnographic vendors to compete for projects in bidding wars.” He states that “ethnographic companies and market research anthropologists no longer assume a solitary uniqueness for their practice but often distinguish themselves by branding.” He focuses on brand differentiation through the evocation of inventiveness, demonstrated through technologies of data capture—cell phones, video cameras, and reporting platforms like blogs are all examples of novel, sexy “technomethodologies.”

The point of differentiation that interests me most is cost cutting; technologies of automation have always been coupled seductively with cost savings. There are plenty of companies competing for business by offering quicker, faster (often capitalized: FASTER) results—time is money and less time is money saved. So what can be cut or compressed in qualitative research planning and execution? How can qualitative research services remain effective while cutting costs?

1. Outsource the experience of being there. To do really good qualitative research, I personally believe there is no substitute for being there—actually interacting with people where they normally hang out and where they work, live, and play. I do, however, realize that sending trained researchers into the field is not always possible. A popular solution is to outsource the data gathering to study participants; I call this the do-it-yourself approach to data gathering. The assumption is that by asking people to produce personal records of their actions, in situ, documented with video, imagery, and text, we can outsource the work of “being there”—data straight from the horse’s mouth has authenticity. While I don’t doubt that some people are really good at reporting their experiences, accounting for and recounting one’s own actions well is hard; good “auto-ethnography” is not simply reportage of what happened. It entails developing a reflexive understanding of one’s own experience and a consideration of the sociocultural milieu in which one is operating. If not conducted well, the presented self may not reflect participants’ actual everyday practice, instead revealing a neatly constructed, sampled view of some ideal of everyday life. Using such data to offer design recommendations may be on par with designing domestic spaces based on how people live “at home” in reality-TV shows. Researchers who offer quality services stress the importance of iterative data analysis—they do not take recorded or reported raw data as incontestable or final.

Less convincing are services that cut costs and promise faster results by automating online interviews and focus groups. For very limited domains, it may be possible to automate data gathering of this kind. However, this is not qualitative research. Nor does simply putting a human in the loop make research qualitative; telephone surveys that require interviewers to follow pro forma scripts in effect turn humans into data-gathering robots. Qualitative interview studies—whether conducted by phone or online—involve sensitive, active listening: listening carefully to nuance and inflection, offering follow-up questions, working to build trust, and being candid. Qualitative interviews are carefully guided conversations.

2. Reduce time spent on the project. Some time in the field is better than no time. But sloppy work is expensive, and to run effective short-term qualitative investigations, you need to triangulate—think about your questions and use all the methods you can to home in on the core question that needs answering. In the mid-1990s, Tony Salvador and Michael Mateas coined the phrase “garage ethnography” to describe quick-turnaround field investigations. They understood that there are many different methods for researching and understanding people and their activities, and that some time in the field was better than none. They also argued that one should be circumspect about the results and be clear about caveats as well as talking points. That is, they pointed to the limitations as well as the benefits of short investigations in deriving thick and thin (that is, detailed and abstract) descriptions. These were researchers who knew how to scale from quick studies to long ones, to triangulate methods, and to recognize when the data they collected did not deliver results.

Many companies are really good at “quick and dirty” qualitative studies. The good can be separated from the bad with a few questions that should have well-thought-out answers: How and why did you select the location(s) and activity setting(s)? How and why did you select your informant(s)? What data did you gather, and how? What analyses are you doing? What has surprised you? There are more questions to pose, of course, but these are the basics.

3. Simplify data analysis, automate the report generation. One way to cut costs is to eliminate the analysis phase and/or have data crunchers—human or machine—simply summarize the results. In a recent review of available services, several of the sites I examined bragged about their automated analysis techniques. I am deeply suspicious of such claims. Data analysis should be an ongoing weaving of themes. Those themes should contribute to a broader explanation of theoretical or practical import and guide a summary report for discussion. A focus on discovery is important—verifying established ideas should not be the only goal. Qualitative studies are reflective and reflexive; analysis and synthesis involve finding patterns; and really good research requires time for going back and revisiting one’s assumptions. If someone is not carefully analyzing and thoughtfully synthesizing data, he or she is doing you a disservice.

Perhaps more controversially, I would caution against research plans that do not entertain possible revision in response to ongoing data analysis. In a qualitative study, research activities take shape gradually, as meaning unfolds through ongoing, concurrent data collection and analysis. Like recent approaches to software development, research processes should be agile. Rather than signifying clarity, professionalism, or certainty, research executed precisely to prespecified procedures may signify rigidity and a lack of sensitivity to the issues at hand.

As a consumer of these services, you should ask to see previous reports that are available for public release. Beware of slide decks filled with data that do not make sense to you; ask what analysis methods were used; ask about concurrent analysis processes; and ask about interim research-process reviews. If your analysts have never heard of field memos or content-analysis techniques, cannot tell you how they analyzed the data they gathered for you, and will not show you any of the raw data (anonymized, if appropriate), be wary. Good researchers leave you with the knowledge of how to replicate the study, not just the study results. Finally, be very suspicious of anyone who tells you how many hours of data they gathered and how many lines of transcription they analyzed, then waits for you to gasp with excitement at the impressive figures cited. This is akin to judging a computer program by its number of lines of code.

4. Automate your participants and their worlds. There are companies building simulated worlds and the 3-D mannequins that inhabit them—today’s crash-test dummies. Around 400 B.C., Archytas of Tarentum built a wooden pigeon, suspended from the end of a pivot, that was propelled by a jet of water or steam; more than 2,000 years later, in 1868, Dederick was issued a patent for a steam-powered man. No one seriously thought these were actual substitutes for a pigeon or a man. Beware the simulated world with simulated people: Unless you are simply looking for “goodness of fit” or an approximate model of use, they are not useful. Remember, a crash-test dummy is just a dummy that conforms to certain norms of shape and size and is necessarily limited in its agency and its responses to events.

I certainly believe that simulations can show you what your assumptions are and where to look further, and can point to questions that need asking. However, simulated worlds cannot successfully model all the possible nuances and contingencies of action and interaction between people, and between people and things. Nor can they successfully model the effects on action and interaction of local social politics, or of the distracted and sometimes disruptive presence of a person in love or in pain. Even the most exceptional models are bounded by the creativity and experience of the human minds that developed them—that specified their ontological structure and their parameters for action and reaction. To put it simply, a model is generally not good at thinking outside its own box.

Therefore, in my opinion, if you genuinely care about your user base, about innovation, and about understanding how your services and products are taken up, it behooves you to eschew the discounted route and actually spend some time talking to people and observing how your products and services are adopted and adapted in real life. To gain valuable strategic insights, engage a consultancy or company that knows what it’s doing, and/or hire good people and maintain a well-resourced in-house team. And you must have the vision to evaluate outcomes on a time frame that may span or cut across typical quarterly corporate budget cycles. Like good social applications, good research can take a while to conduct and to mine for reliable results.

So if your research budget is getting tight, don’t reach for the red pen and trade qualitative research for cut-rate services. Don’t ask your designers and strategists to innovate without a true understanding of current and potential “users.” That’s like asking them to race in the Le Mans Grand Prix of Endurance on a tricycle.

Do think about your questions in detail, understand the purpose, and identify the narrow and broad deliverables you can get from the research. Here are some things I think about when viewing a website that claims to offer budget deals on qualitative work. Qualitative research is a way of looking at the world, of engaging with people, with data, and with one’s own methodological assumptions; a qualitative approach is not adherence to a particular form of data collection or data representation. Therefore, look for people who are willing to take a mixed-methods approach. Look for people who know how to ask questions and think about how methods can be combined to answer them. Beware those who proffer a laundry list of techniques. Look for companies that have at least one person (and I mean at least one!) on the team and/or advisory board who actually trained in behavioral analysis or human-centered technology design. If no one on the team cared enough about people and/or design to study anything in these areas, the company should not be entrusted with your questions—or your money. Look out for words and catchphrases that are used simply for service differentiation, to catch your eye. Be cognizant of terms like “qualitative,” “ethnography,” and “field study.” They are not magic pixie dust; they should imply careful, considered, and well-conducted research.

Author

Elizabeth Churchill is a principal research scientist at Yahoo! Research, leading research in social media. Originally a psychologist by training, for the past 15 years she has studied and designed technologies for effective social connection. At Yahoo!, her work focuses on how Internet applications and services are woven into everyday lives. Obsessed with memory and sentiment, Churchill spends her spare time researching how people manage their digital and physical archives. She rates herself a packrat; her greatest joy is an attic stuffed with memorabilia.

Footnotes

DOI: http://doi.acm.org/10.1145/1572626.1572634

©2009 ACM  1072-5220/09/0500  $10.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2009 ACM, Inc.
