Features

XXV.5 September-October 2018
Page: 38

Chatbots: Changing user needs and motivations


Authors:
Petter Bae Brandtzaeg, Asbjørn Følstad


Chatbots have been around for decades. However, the real buzz around this technology did not start until the spring of 2016. Reasons for the sudden renewed interest in chatbots include massive advances in artificial intelligence (AI) and a major usage shift from online social networks to mobile-messaging applications such as Facebook Messenger, Telegram, Slack, Kik, and Viber. The first of these reasons holds promise that intelligent chatbots may well be within reach. The second concerns service providers’ need to reach users in the context of mobile messaging. However, in spite of these drivers, current chatbot applications suggest that conversational user interfaces still face substantial challenges, both in general and for the field of human-computer interaction (HCI) in particular. Chatbots imply not only a change in the interface between users and technology; they also imply changing user dynamics and patterns of use.


We have previously outlined the potential implications and opportunities that chatbots hold for the field of HCI [1]. In this article, we dig into what we see as a key challenge with chatbots from a user-centered perspective. Developers and designers have an urgent need to know more about how people experience chatbots and to understand the user needs that motivate the future use of chatbots. We therefore need to ask how chatbots resonate with user needs and desires and, in turn, how these same needs and desires evolve as users get more experience with chatbots.


The word chatbot is derived from chat robot: a machine agent that serves as a natural language user interface to data and services through text or voice. Chatbots allow users to ask questions or issue commands in their everyday language and to get the needed content or service in a conversational style. If chatbots gain the expected popular uptake, the technology will dramatically change the way in which people interact with data and services online. It has been predicted that such a conversational disruption may lead service providers to invest less in apps and instead prioritize chatbots as a channel for reaching out to users. As one messaging platform provider, Kik, claims on its developer site: “First there were websites, then there were apps. Now there are bots.”
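To make this concrete, here is a minimal sketch of such a natural language interface to services: a few lines of Python with hypothetical service functions and deliberately naive pattern matching. Real chatbots rely on far more capable language understanding and on actual back-end services.

    import re

    def get_weather(city):
        # Placeholder for a call to a real weather service.
        return "It looks like rain in " + city + " today."

    def book_flight(destination):
        # Placeholder for a call to a real booking service.
        return "Okay, searching for flights to " + destination + "."

    def reply(message):
        # Map an everyday-language request to a service and answer conversationally.
        match = re.search(r"weather in (\w+)", message, re.IGNORECASE)
        if match:
            return get_weather(match.group(1))
        match = re.search(r"flight to (\w+)", message, re.IGNORECASE)
        if match:
            return book_flight(match.group(1))
        return "Sorry, I can only help with weather and flights right now."

    print(reply("What is the weather in Oslo?"))   # It looks like rain in Oslo today.
    print(reply("Find me a flight to Berlin"))     # Okay, searching for flights to Berlin.

Even this toy example hints at the core difficulty: any phrasing the patterns do not anticipate falls straight through to the apology at the bottom.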


Developers and designers have an urgent need to know more about how people experience chatbots and to understand the user needs that motivate the future use of chatbots.


On average, people are spending more time on messaging platforms. Facebook Messenger, for example, is reported to have had more than 1.2 billion monthly active users in 2017. Given this increase, chatbots may be a critical way to reach customers. This fundamental change in online user behavior has propelled major companies to take on chatbots. Not only the typical technology giants such as Google, Amazon, Facebook, and LinkedIn, but also consumer service companies such as Starbucks, British Airways, and eBay now aim to reach their customers through chatbots. Gartner presented as a top strategic prediction for 2018 that “more than 50 percent of enterprises will spend more per annum on bots and chatbot creation than traditional mobile app development” by 2021 [2].

Current Chatbots Often Fail

The vision of a compelling conversational interface is not easily attainable. We know that a key success factor for chatbots and natural language user interfaces is how well they can support user needs in the conversational process seamlessly and efficiently. However, while chatbots may potentially increase individual flexibility, expand opportunities for information retrieval and learning, and compensate for limitations in digital competence, current chatbots often fail [3]. One possible explanation is the difficulty in designing for open-ended conversations—chatbot capabilities seldom capture all the various ways in which the user might want to engage. There has also been a substantial technology push in chatbot development, potentially leaving users frustrated by a perceived lack of concern for how people use chatbots and for what purpose. Organizations are rushing into the space, vying to be the first to deploy chatbots in their particular service domain. In this early phase of chatbot deployment, chatbot initiatives too often aim for poor use cases, ignoring user needs and user experiences.

Anna was too human. Ikea was one of the first companies to provide chatbots in customer service. Its experience is a thought-provoking example of some of the potential challenges with which chatbot designers and developers currently struggle. For more than a decade, the chatbot Anna aimed to answer customer questions on Ikea.com. Anna was ready to listen 24/7. Her purpose was to guide customers around the Ikea website in an interactive and conversational way, adding a personal touch.

However, Ikea retired Anna in 2016, with no plans for a replacement. While the full reason is unclear, frustrating customer interactions with Anna may have played a major role in ending Ikea’s innovative chatbot adventure. As Magnus Jern, president of the mobile solutions company DMI, told the BBC, “If you try too hard to be natural, it diverts from the real purpose of [the chatbot], which is about giving the right answer as fast as possible.” According to Ikea, the initiative clearly struggled to strike a balance between human and robot aspects: Because Anna was too human, people asked her “stupid questions,” often sex related [4]. This is somewhat different from the experiences with other chatbot initiatives, where users have complained about a chatbot experience being overly robotic and lacking a personal touch.

The ultimate chatbot fail. However, the ultimate chatbot fail was Microsoft’s Tay, deployed on Twitter in 2016. Tay was perceived as a cutting-edge AI-based chatbot, the goal being that she would learn and mimic the personality of a 19-year-old girl through interactions with Twitter users. The problem was that Tay learned not only from well-meaning Twitter users but also from Twitter trolls, giving Tay fluency in all sorts of hateful conversation. According to Microsoft, the teaching of Tay turned out to be a coordinated effort by trolls in specific online communities (e.g., 4chan, Reddit, and eBaum’s World), who abused Tay’s commenting skills and had her respond in offensive ways. Many of Tay’s comments mimicked popular repugnant memes. In less than 24 hours, Microsoft removed Tay from Twitter, but only after she had praised Adolf Hitler and used harsh language to express anti-feminist sentiment.

The stories of Anna and Tay leave chatbot developers and designers with tricky questions. Challenges concerning the automation of dialogue may be even more substantial than those we faced when designing graphical user interfaces. At first this may sound strange, as the natural language dialogue in chatbots suggests a low threshold for users to access data and services. However, whereas conversational interfaces are truly intuitive when applied to interactions between people, conversations between humans and automated conversational agents are more challenging.


Chatbot Interactions With Humans

Chatbots in general, and AI-powered chatbots in particular, need substantial adaptation and maintenance to perform their tasks properly. The potentially unpredictable variation in user input, as demonstrated in the chatbot failures above, and the question of what constitutes a valid chatbot response represent substantial challenges for the HCI field. In order to develop chatbots that adapt to the needs of specific users and conversational contexts, there is likely a need for improved user and context models. Chatbots and their interactions with humans must be analyzed and redesigned, not only with concern for specific interaction sequences, but also with the aim of improving generative responses to inputs from a range of users within a variety of conversational contexts.
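As a rough illustration of what such a user and context model might look like in practice, the sketch below (Python; the names and slot structure are our own illustrative assumptions, not a reference design) keeps per-user state across turns so that a terse follow-up message can be interpreted in the context of the previous one.

    from dataclasses import dataclass, field
    from typing import Dict, Optional

    @dataclass
    class ConversationContext:
        user_id: str
        preferred_language: str = "en"
        last_intent: Optional[str] = None
        slots: Dict[str, str] = field(default_factory=dict)  # e.g., {"city": "Oslo"}

    def handle_turn(ctx, message):
        text = message.lower()
        if "weather" in text:
            ctx.last_intent = "weather"
            if "city" in ctx.slots:
                return "Checking the weather in " + ctx.slots["city"] + "."
            return "Which city?"
        if ctx.last_intent == "weather":
            # A bare city name only makes sense because the context remembers the last intent.
            ctx.slots["city"] = message.strip().title()
            return "Checking the weather in " + ctx.slots["city"] + "."
        return "Sorry, I didn't catch that."

    ctx = ConversationContext(user_id="u42")
    print(handle_turn(ctx, "What's the weather like?"))  # Which city?
    print(handle_turn(ctx, "oslo"))                      # Checking the weather in Oslo.

Even this toy model shows why adaptation is hard: the context must be built, kept current, and interpreted correctly for every user and every conversational situation.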

A poorly executed conversational user interface frustrates users, which in turn will be harmful for business. A conversational user interface also has fewer opportunities for representing the potential offerings of a service provider than does a graphical user interface. The chatbot dialogue instead needs to motivate focused engagement, delivering simple and compelling user experiences. In so doing, designers and developers need to tackle questions such as: How friendly should they make the chatbot? How fast should the chatbot respond? How humanlike or personal? What about gender; should the chatbot be female, male, or gender neutral? Should the chatbot include a talk-to-human option? Perhaps the chat element itself should be deemphasized, with a greater reliance on preset answer options? The latter approach may remove misunderstandings in free-text conversations, making the process more efficient. However, it also represents a return to the old-style graphical interface, with a limited set of predefined choices at any given time.
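One way to reason about the last two questions is sketched below (Python; the reply structure, threshold, and wording are hypothetical, not any particular platform’s API). The bot combines free text with preset answer options and offers a talk-to-human escape hatch after repeated misunderstandings.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class BotReply:
        text: str
        quick_replies: List[str] = field(default_factory=list)  # preset options shown as buttons
        handoff_to_human: bool = False                           # route the conversation to staff

    FAILED_ATTEMPTS_BEFORE_HANDOFF = 2  # assumed threshold

    def respond(message, failed_attempts):
        text = message.lower()
        if "opening hours" in text:
            return BotReply("We are open 9 a.m. to 5 p.m. on weekdays.",
                            quick_replies=["Find a store", "Talk to a human"])
        if "talk to a human" in text:
            return BotReply("Connecting you to one of our staff.", handoff_to_human=True)
        if failed_attempts >= FAILED_ATTEMPTS_BEFORE_HANDOFF:
            return BotReply("I'm having trouble helping with this. Would you like to talk to a human?",
                            quick_replies=["Talk to a human", "Try again"])
        return BotReply("I didn't quite get that. You can try one of these:",
                        quick_replies=["Opening hours", "Find a store", "Talk to a human"])

Preset options make the exchange more predictable, but, as noted above, they also pull the interaction back toward a graphical interface with a limited set of choices.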

Despite their long history, chatbots are still in rapid development, with advances every day. We can also expect that the ways in which people interact with conversational user interfaces will change in the future, resulting in new user behaviors as well as new social norms and user expectations. Consequently, more knowledge about chatbot experiences from an end-user point of view is needed. New user insights are critical for chatbot designers and developers, who must know the desires, needs, and practices of chatbot users. Designing an interactive technology such as a chatbot requires in-depth knowledge about why people choose to use this technology, as well as why people stop using it. It is necessary to understand the people who use chatbots, their goals, the tasks they have to perform, and their context of use. Goals and tasks have often been seen in relation to motivational issues. Motivation theories have accordingly led researchers to focus on the factors that inspire people to use new technologies and the factors that make technology use successful over a longer time period.

New User Needs and Motivations

In our high-choice media environment, users of new technologies often exhibit a range of motivations and patterns of use. And with ever more chatbots being launched, there is an ever-increasing range of possibilities. The great variety of chatbots is exemplified in the BotList (https://botlist.co/), a website where people can find chatbots for a broad range of purposes and messaging platforms. An enormous variation in chatbot alternatives can also be seen by exploring Facebook Messenger, which now has over 200,000 chatbots [5].

One stream of chatbots allows users to complete quick and specific tasks such as checking the weather, organizing meetings, ordering food, or booking a flight. Chatbots can also help people explore online content or services in areas such as food and fashion. For example, Microsoft launched Heston Bot, which focuses on food and cooking, while the global clothing company H&M launched a chatbot to provide shopping suggestions based on photos from users’ personal wardrobes.


We can expect that the ways in which people will interact with conversational user interfaces in the future will change, resulting in new user behaviors as well as new social norms and user expectations.


Another stream of chatbots supports more long-term relationships and activities such as charitable giving, civic engagement, work, fitness, and personal health. Some of these have already proven to be important co-workers in office environments, while others are “smalltalk” chatbots such as Mitsuku and Replika, which seem to support the need for connectedness. Some users find a chat with Mitsuku to be comforting, much in the same way as a chat with a fellow human can be. Users are revealing intimate details of their lives to their chatbot friend Replika, which is growing in popularity, in particular among young people between ages 18 and 25.

In general, users expect customer-service chatbots to be effective and efficient in conducting productivity tasks such as accessing specific content or helping with administrative chores; other chatbots may be used for entertainment-based and social experiences. Moreover, successful chatbots seem to inform users about what to expect from the beginning. This means they are transparent about who the users are having a conversation with—that they are interacting with a chatbot, not a human. Information about what chatbots are able (and not able) to deliver is another important factor to communicate to the user.
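A simple way to build in this kind of transparency is an onboarding message along the lines of the sketch below (the bot name and capability list are purely illustrative), which tells users up front that they are talking to a bot and what it can and cannot help with.

    CAPABILITIES = ["track an order", "answer questions about returns", "find a store near you"]

    def welcome_message(bot_name="ShopBot"):
        # State clearly that this is a bot, list what it can do, and offer a way out.
        skills = ", ".join(CAPABILITIES)
        return ("Hi, I'm " + bot_name + ", an automated assistant, not a human. "
                "I can help you " + skills + ". "
                "For anything else, type 'talk to a human' and I'll hand you over to our staff.")

    print(welcome_message())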

However, while there is some existing research into people’s uses of and motivations for using media technology, there is a dearth of research on why people use chatbots—or stop using them. According to the uses-and-gratifications perspective [6], people’s motivations to use media are based on social and psychological factors. People are found to use media technologies strategically by employing different technologies for different purposes. Thus, media users select among media technologies based on how well a certain media form helps them meet specific needs or goals. A fundamental notion in the uses-and-gratifications perspective is that people are motivated by a desire to fulfill certain needs. Therefore, the key is to ask not how a particular media use influences users but rather how users’ needs or requirements influence their particular media choices. These choices are found to be motivated by several basic needs, such as entertainment, social connection, identity, and information.

So, do chatbot users use media in a completely different way from previous media users, and are user needs shifting with changing user interfaces? Our own recent research on the motivational factors of chatbot users in a U.S. sample [7] found basic needs among chatbot users similar to those found in more traditional media-use studies, albeit with some interesting differences. The majority of chatbot users in our study reported that their main motivation for using chatbots was the effective and efficient accomplishment of productivity tasks. Participants typically reported using chatbots to obtain assistance or information. However, some users also reported entertainment or social and relational factors as their main motivations for using chatbots. They described chatbots as a fun or entertaining means to kill time. Some of the participants addressing social and relational factors even reported that chatbots could help reduce loneliness or enable socialization, though they of course knew that chatbots are not people. A final category of users highlighted the novelty aspect, emphasizing an interest in trying out new technology.

The productivity aspect, concerning efficiency and effectiveness, is key for chatbot users, who need to feel in control of the technology. More concretely, they want to get particular tasks done quickly and efficiently. This might have to do with the growing demands on users’ time and with people’s expectations in an accelerating digital society. The main gratification for the typical chatbot user is therefore to make life easier and more productive.

Chatbot users are not necessarily looking for social experiences with family and friends, which they find through other media channels. Rather, they may crave a means of sharing personal information without fear of judgment. Thus, the experience of connectedness is not necessarily about other people, but rather about being connected to oneself. A chatbot may serve a need for connection and support 24/7, which may not be available through friends and family. Early research [8] revealed the tendency for people to open up more easily to a computer or a technology than to other humans. It is safer for people to engage in self-disclosure with a chatbot. However, we know less about what such relationships with chatbots can lead to in the long run—whether they can generate increased loneliness or depression, or whether they can be balanced in a way that benefits users’ mental health.

Another important lesson learned is that chatbots are not a one-solution-fits-all technology. People have multiple motivations, and the purposes for using a chatbot can vary enormously. As such, there is a need for an appropriate range of use cases in the chatbot context.

Conclusion

Designing and developing chatbots is about understanding the new user needs and motivations that successful automated conversational interfaces must serve. Conversational user interfaces to data and services mean a dramatic shift in how designers and developers are used to thinking about interaction and user needs. Chatbots are changing user behavior as well as user needs. They are also altering user interfaces themselves, creating new demands in the field of HCI. We have outlined here some of these new needs and challenges posed by the emergent trend of chatbots.

References

1. Følstad, A. and Brandtzaeg, P.B. Chatbots and the new world of HCI. Interactions 24, 4 (2017), 38–42.

2. Gartner top strategic predictions for 2018; https://www.gartner.com/smarterwithgartner/gartner-top-strategic-predictions-for-2018-and-beyond/

3. Luger, E. and Sellen, A. “Like having a really bad PA”: The gulf between user expectation and experience of conversational agents. Proc. of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2016, 5286–5297.

4. Wakefield, J. Would you want to talk to a machine? BBC. Aug. 4, 2016; http://www.bbc.com/news/technology-36225980

5. Statista 2018; https://de.statista.com/statistik/daten/studie/662144/umfrage/anzahl-der-verfuegbaren-chatbots-fuer-den-facebook-messenger/

6. Katz, E., Blumler, J.G., and Gurevitch, M. Uses and gratifications research. The Public Opinion Quarterly 37, 4 (1973), 509–523.

7. Brandtzaeg, P.B. and Følstad, A. Why people use chatbots. In Internet Science. INSCI 2017. Lecture Notes in Computer Science, vol 10673. I. Kompatsiaris et al., eds. Springer, Cham, 2017, 377–392.

8. Weizenbaum, J. ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM 9, 1 (1966), 36–45.

Authors

Petter Bae Brandtzaeg has a Ph.D. in media and communications from the University of Oslo. His main research interests are HCI and patterns of use and implications of new information and communication technologies. He coordinates the Social Health Bots project on chatbots for health services, supported by the Research Council of Norway. pbb@sintef.no

Asbjørn Følstad has a Ph.D. in psychology from the University of Oslo and is currently senior researcher at SINTEF. His main research interests are human-computer interaction and human-centered design. He coordinates the Human-Chatbot Interaction Design project and Chatbots4Loyalty, both supported by the Research Council of Norway. asf@sintef.no


©2018 ACM  1072-5520/18/09  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

