In a piece published last year in this magazine, Charles Hannon made a compelling case for careful thinking about how the words used by AI interfaces such as Siri and Alexa signal inferior status. The way we—or our AI interfaces—use pronouns and other parts of speech is an important means of signaling status. The more pronouns used, the lower the status. This relationship correlates with gender; Hannon points to research that shows that women typically use more personal pronouns than men. And so the default female voice of Alexa and other AI interfaces, from that of Siri to a Garmin GPS, stumbles over itself, inserting extra I's to signify its lesser status relative to us superior humans. This provides a powerful insight into how we relate with objects in the Internet of Things. But new work by Donna Hoffman and Tom Novak of George Washington University provides a complementary way to think about how AI design can shape our interactions with smart objects.
Hoffman and Novak start from a foundational idea in marketing: that theories describing relationships between people work in other domains. For example, the concept of brand relationships, developed by Susan Fournier, is helpful in understanding why some people think it's fun to buy a different brand of shampoo every time they're at the store—because they're having a series of flings!—while some Ford truck owners wouldn't be caught dead driving a Chevy pickup—because they're in a committed relationship akin to marriage.
Hannon shows how we should be conscious that it is not only the gender of the voice, but also the verbal patterns deployed in AI interfaces that can reinforce problematic gender hierarchies. Memorably illustrating the point with an analysis of dialogue from the 2013 Spike Jonze film Her, Hannon argues that to create female-voiced AI interfaces that will be perceived as higher-status, developers "should reduce the gratuitous use of I-words while at the same time increasing word patterns that indicate high degrees of cognitive processing and an awareness of social relationships." This resonates with another influential idea in marketing, which is that brands can be effectively described through measures of two dimensions: warmth and competence. These dimensions, of course, are also qualities of people. We stereotype rich people as competent, but not warm. This is similar to how we regard brands such as Mercedes or Rolex, and wholly differently from how we regard incompetent, conniving brands such as Enron or Equifax. For the most part, AI seems to assume that to find a place in the smart home, consumers will want their objects to be high in both warmth and competence. And so we end up with the feminine, compliant Alexa, who takes pains to apologize when "she" doesn't understand our request. When competence falls short, warmth becomes a useful stand-in.
The brilliance of Hoffman and Novak's work is that it recognizes the full spectrum of relationship types in addition to the master-servant relationship that underpins many of our current devices. Someone who uses Alexa or IFTTT to control a WeMo switch, for example, exemplifies a pure form of the master-servant relationship, where the technological object functions as an agent of the consumer. Hoffman and Novak's model also accounts for who's in charge in a particular interaction: the user or the object. The learning capabilities built into the Nest thermostat, for example, mean that it asserts control in the home, acting on behalf of its user. When one relationship partner is dominant and the other is willingly submissive, master-servant relationships tend to work well—provided both partners have the same degree of other-orientation, which is typically conceptualized as warmth. Now, with the Nest thermostat, this is more than a terrible pun. If the Nest has a different idea than its user about what constitutes a comfortable indoor temperature, it's likely that the master-servant relationship will break down. This suggests the designers of interactive systems should think about whether each partner in the relationship is focused on itself or on others. Will the Nest selfishly go about the business of optimizing my energy bill with no regard to how I feel, or will it prioritize my comfort? Building on others' research, Novak and Hoffman refer to this orientation to the needs and concerns of others as communion. A relationship partner that focuses on others is seen as warm and caring, while one that is focused only on the self is seen as cold and aloof.
In forthcoming research, Hoffman and Novak explore three other major categories of relationships in addition to master-servant. In partner relationships the consumer's level of agency matches that of the object. I look with wonder on my Roomba robotic vacuum cleaner as "she"—and I realize my gendering of this object is both retrograde and enabled by a cavalcade of programming and marketing decisions—determinedly removes every trace of dog hair from my living room. Of course, I help by moving stray objects out of her way and dusting those places she can't reach. Somehow her help makes cleaning feel like less of a chore. It's because we're partners in the agentic goal of cleaning house, and we're both more than a little bit house-proud; why else would she chime so triumphantly upon the completion of her duties? In cleaning mode we share a common goal to remove dirt, and this unites us in communion. Like master-servant relationships, this type of true partner relationship tends to be positive, but every now and then I get this strange feeling while cleaning alongside Roomba that I'm doing work for a robot—especially when I have to rescue her from underneath the end table where she occasionally gets stuck, evidencing her own agentic limitations. When I become a servant to the robot, the diminishment of my own agency suggests the partner relationship Roomba and I enjoy is moving toward one characterized by detachment.
The final two relationship types are worth calling out as bad examples. Mismatched relationships are those where either the consumer or the object is in charge, and the sense of communion is not aligned. These relationships are characterized by exploitation or a complete lack of regard for the other. Perhaps the simplest example is the out-of-reach fire alarm that starts beeping at 2 a.m. This is a selfish object that cares only about getting a new battery, not about whether others are getting any sleep, or the fact that there is no ladder, broomstick, or replacement battery readily available. For obvious reasons, mismatched relationships are not durable, nor do they tend to be positive. The fourth and final type is the unstable relationship. This happens when object and consumer share the same degree of agency and are equally selfish or equally aloof. An early spoken-language interface models the problem with this type of relationship perfectly:
"Open the pod bay doors, HAL."
"I'm sorry, Dave. I'm afraid I can't do that."
If we were to follow Hannon's suggestions for increasing the sense of authority and agency in AI interfaces, we might rescript HAL's response to the much simpler "No," though one might be tempted to include a simple "I'm sorry" to increase HAL's warmth. In contrast, the analytical framework of agency and communion that Novak and Hoffman develop suggests that we reflect on the distribution of agency between a smart object and its user, and dig a bit deeper into the characteristics of lasting relationships. What did it take for HAL to get to the point where opening the pod bay doors was simply not an option? When it comes to the design of AI interfaces, many questions still need to be answered. Among them: Are we content with an Internet of Things built largely on the master-servant model? Will an over-reliance on this model keep us—or our smart objects—from reaching our full potential? Finally, if we integrate partner relationships into the Internet of Things, where the object and user share the same degree of agency, how can interfaces be designed so that they improve over time?
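For readers who think in code, the two-dimensional typology sketched above can be captured in a few lines. This is my own shorthand, not Hoffman and Novak's formalization: the numeric scales, the threshold for "matched," and the decision to lump any misaligned-communion pairing under "mismatched" are all illustrative assumptions.

```python
# A rough sketch of the relationship typology for smart objects, keyed on
# two dimensions: agency (who is in charge) and communion (orientation
# toward the other partner). Labels and logic are illustrative shorthand,
# not the authors' model.

def classify_relationship(consumer_agency: float, object_agency: float,
                          consumer_communion: float, object_communion: float,
                          tolerance: float = 0.1) -> str:
    """Label a consumer-object pairing; inputs assumed to lie in [0, 1].

    `tolerance` decides when two values count as "matched" -- a made-up
    parameter for this sketch.
    """
    agency_matched = abs(consumer_agency - object_agency) <= tolerance
    communion_matched = abs(consumer_communion - object_communion) <= tolerance

    if agency_matched and communion_matched:
        # Equal agency: shared communion decides the outcome.
        if consumer_communion > 0.5:
            return "partner"        # shared goals, e.g. cleaning house with a Roomba
        return "unstable"           # equally selfish or aloof, e.g. Dave and HAL
    if not agency_matched and communion_matched:
        return "master-servant"     # one leads willingly, e.g. Alexa flipping a WeMo switch
    return "mismatched"             # misaligned communion, e.g. the 2 A.M. fire alarm
```

For instance, a high-agency user commanding a low-agency object, with both oriented toward the other, lands in the master-servant cell; equal agency plus equal self-absorption yields the unstable Dave-and-HAL standoff.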
While you're ruminating over that, I have to go empty my Roomba.
1. Hannon, C. Gender and status in voice user interfaces. Interactions 23, 3 (May-June 2016), 34–37. DOI:https://doi.org/10.1145/2897939
2. Hoffman, D. and Novak, T. Consumer and object experience in the Internet of Things: An assemblage theory approach. Journal of Consumer Research. In press. DOI:https://doi.org/10.1093/jcr/ucx105
3. Fournier, S. Consumers and their brands: Developing relationship theory in consumer research. Journal of Consumer Research 24, 4 (Mar. 1998), 343–353. DOI:https://doi.org/10.1086/209515
4. Kervyn, N., Fiske, S.T., and Malone, C. Brands as intentional agents framework: How perceived intentions and ability can map brand perception. Journal of Consumer Psychology 22, 2 (Apr. 2012). DOI:https://doi.org/10.1016/j.jcps.2011.09.006
Jonathan Bean is an assistant professor of architecture, sustainable built environments, and marketing at the University of Arizona. He researches domestic consumption, technology, and taste. firstname.lastname@example.org
Copyright held by author
The Digital Library is published by the Association for Computing Machinery. Copyright © 2018 ACM, Inc.