Features

XXVII.1 January - February 2020
Page: 36

Special Topic: Live-streaming research in HCI: Introduction

Raquel Robinson, Katherine Isbister


Curious about live streaming? In this introduction, we'll provide an overview of this rapidly growing phenomenon, share a bit about our own live-streaming research, and preview the five articles in this Special Topic section. In the past decade, video sharing has become increasingly popular. Platforms for broadcasting content, such as YouTube and Twitch, have become widespread, frequented by millions of users every day. In a nod to broadcast television, these platforms are generally divided into channels: online profiles where streamers post their content. Each channel typically features videos on a specific theme, allowing audiences to tune in to the channels that fit their interests. A channel can host prerecorded videos or broadcast live video. Sharing live video is a relatively new addition to these sites, dating back only to 2007.



The popularity of sharing live content in recent years is largely due to a platform called Justin.tv, which started as a way for people to share general live-video content and was rebranded as Twitch in 2011 [1]. Since then, Twitch has become the leading service for gameplay streaming, which continues to gain traction: the Twitch platform alone draws more than 1.2 million average viewers per day and more than 3.7 million broadcasters per month [1].

However, while gaming content and Twitch sparked the rise of live streaming, other streamed activities are becoming increasingly widespread as well. People stream everything from art (Twitch Creative) and live-coding sessions to talk-show-style conversations and IRL (in real life) activities, streaming from daily life. Streaming is now even being integrated into social media experiences; platforms such as Facebook Live and Periscope allow immediate live broadcasting from mobile phones. What made Twitch unique at first (but was soon integrated into all platforms) was its live chat feature: alongside the stream is a chat window where viewers can comment and chat with the streamer and other audience members in real time (Figure 1a).

Figure 1. Twitch overlay with labels. A (orange): live chat feed; B (blue): heart rate and GSR information from the streamer (labeled as "sweat" for nontechnical viewers); C (red): emotion-labeling information.

In 2015, we became interested in live streaming through our work creating a tool called All the Feels (Figure 2), which adds the streamer's biometric information to their gameplay overlay. A graphical overlay for a live streamer is additional content (e.g., a "donate now" button, ads, a webcam feed) placed over different parts of the screen (Figure 1a); think of it as similar to pop-ups and banners in broadcast TV. Generally, the chat feed sits off to the side of the overlay, and streamers choose the content they want on it. All the Feels [2] displays heart rate and galvanic skin response (GSR) information to spectators (Figure 1b), read from the player in real time through a commercially available wearable device, the Empatica E4 wristband (Figure 2). The tool also provides emotion labels generated by commercial auto-detection software that uses the player's webcam (Figure 1c). Players can choose from several predesigned sets of emojis (e.g., sharks, Figure 1c) that the software will use to label their current expression, but the base labels provided by the software remain the same: joy, sadness, anger, disgust, surprise, fear.
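To make the pipeline concrete, here is a simplified sketch of how a tool in this vein might assemble one frame of overlay text. This is our illustration, not the actual All the Feels implementation: the function names, the emoji-set naming, and the simulated signal source are all assumptions (the real tool reads live data from the Empatica E4 and commercial emotion-detection software).

```python
import random

# Hypothetical stand-in for the wearable's live feed; the real tool reads
# heart rate and GSR from an Empatica E4 wristband in real time.
def read_biosignals():
    return {"heart_rate": random.randint(60, 120), "gsr": random.uniform(0.1, 5.0)}

# The base labels the detection software provides (per the article).
EMOTION_LABELS = ["joy", "sadness", "anger", "disgust", "surprise", "fear"]

def pick_emoji(emotion, emoji_set):
    """Map a detected emotion label to the streamer's chosen emoji set."""
    return emoji_set.get(emotion, "?")

def overlay_text(signals, emotion, emoji_set):
    """Format one frame of overlay text, labeling GSR as 'sweat' for viewers."""
    return (f"HR: {signals['heart_rate']} bpm | "
            f"sweat: {signals['gsr']:.2f} uS | "
            f"{emotion} {pick_emoji(emotion, emoji_set)}")

# Illustrative emoji-set names, in the spirit of the shark set in Figure 1c.
shark_set = {"joy": "happy-shark", "fear": "scared-shark"}
print(overlay_text({"heart_rate": 84, "gsr": 1.23}, "joy", shark_set))
# HR: 84 bpm | sweat: 1.23 uS | joy happy-shark
```

In a running overlay, a loop would call `read_biosignals` and the emotion detector a few times per second and redraw this text on the stream.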

Figure 2. Closeup of All the Feels. The overlay on the left shows heart rate and GSR/sweat information, plus software-labeled levels of the streamer's joy, sadness, disgust, surprise, and fear. On the right is the Empatica E4 wristband used to track the player's heart rate and GSR.

While prior use of physiological signals in HCI has primarily focused on tracking user/player experience toward better understanding or improving that experience, All the Feels takes a detour from this emphasis on evaluation and understanding. We are not attempting to connect physiological signals to some kind of experiential ground truth for ascertaining details of a player's emotional experience (a goal that some in HCI problematize, in ways that are outside the scope of this brief introduction). Instead, our aim is to facilitate engagement among spectators in digital play, creating another interesting channel of information that could help build connection and community. Thus, our project's primary aim was not to provide highly accurate information about biosignals and what they say about player emotions. We did our best to use tools that detected biosignals with reasonable accuracy, but we were more interested in developing an overlay and tool that could be broadly distributed, so we chose devices and software readily available to nonprofessionals. The goal was to explore how spectators might respond to these additional layers of information. And indeed, spectators reported that the signals displayed in the overlay were interesting to them, even when they were not perfectly accurate in every moment.

In thinking about how best to present streamers' emotional information to spark connection with spectators, we wanted to visualize the information not just as a standard bar graph, but rather in an alternative and creative fashion that would seamlessly fit into streamers' existing ecosystems. We therefore hired an artist to create a variety of emoji sets for the emotional overlay, so streamers could choose a set that best represented their own aesthetic or, alternatively, design a new one of their choosing (Figure 3). There is still more work to be done to unpack how spectators receive this new style of biometric overlay, but one interesting finding in a preliminary study was that the tool generated a greater empathetic response in women spectators than in men. Since women make up only 20 percent of Twitch users, this suggests that future design directions could productively focus on this specific user base.

Figure 3. Different emoji sets available to streamers to customize the spectator experience of the biometric overlay.

Audience interactivity (e.g., the live chat feed) is a large part of what differentiates platforms like Twitch and YouTube from live television broadcasts. The customizability of streamers' channels, combined with robust chat features (emoji sets, polls, Twitch extensions [1]), makes these platforms a fertile environment for social experiments, enabling experiences that could happen nowhere else. For example, in 2014, Twitch Plays Pokemon turned the chat room into a game controller, allowing users around the world to collaborate to advance in the game Pokemon Red. Millions of players tuned in to type controller commands into chat, which were then converted into button presses to control an old Game Boy emulator (Figure 4).
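The chat-to-controller idea can be sketched in a few lines. This is a hypothetical illustration of the general mechanism, not Twitch Plays Pokemon's actual code: the `user: message` line format and the function names are our assumptions.

```python
# The Game Boy button vocabulary Twitch Plays Pokemon accepted in chat.
VALID_COMMANDS = {"up", "down", "left", "right", "a", "b", "start", "select"}

def parse_chat_line(line):
    """Extract (username, command) from a 'user: message' chat line;
    return None for messages that are not valid controller input."""
    user, _, message = line.partition(":")
    command = message.strip().lower()
    if command in VALID_COMMANDS:
        return user.strip(), command
    return None

def buttons_from_chat(lines):
    """Turn a batch of raw chat lines into an ordered button-press queue
    to feed to the emulator."""
    presses = []
    for line in lines:
        parsed = parse_chat_line(line)
        if parsed:
            presses.append(parsed[1])  # keep every valid command, in order
    return presses

chat = ["ash_k: up", "misty99: hi everyone!", "brock: A", "oak: left"]
print(buttons_from_chat(chat))  # ['up', 'a', 'left']
```

In the real stream, a loop consumed chat continuously and injected each queued press into the emulator; later "democracy mode" variants tallied votes over a window instead of executing every command.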

Figure 4. Twitch Plays Pokemon feed from 2014. The chat feed is on the right; each username is followed by the command used to control the game [3].

Games that use the audience's chat input to affect gameplay are becoming increasingly popular. Choice Chamber requires spectators to vote on certain game elements that directly affect the game's difficulty, meaning spectators play a vital role in its progression [4]. Green-screening and facial-recognition technology are just two of the many resources available to streamers to make their streams more interesting to viewers. Twitch streamer wgrates transforms himself into a dog sitting at a desk, floating in a corner of the screen; the dog's mouth and face move in accordance with wgrates' speech and facial expressions (Figure 5).
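An audience poll of this kind reduces to tallying chat votes against the options the game currently offers. The sketch below is our own minimal illustration of that mechanic; the option names and function signature are assumptions, not Choice Chamber's API.

```python
from collections import Counter

def tally_votes(votes, options):
    """Count spectator votes for the current set of game options
    (e.g., which item to spawn); ignore anything outside that set.
    Returns the winning option, or None if no valid votes came in."""
    counts = Counter(v.lower() for v in votes if v.lower() in options)
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical poll: the game asks spectators to pick the next item.
options = {"sword", "hammer", "wand"}
print(tally_votes(["sword", "HAMMER", "sword", "pizza"], options))  # sword
```

A game would run such a tally each time a poll window closes and apply the winning option to the level being generated.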

Figure 5. Popular streamer wgrates (lower left of the screen) using facial-recognition technology and character-animation software to turn his face into a dog.

The rise in popularity of live streaming has been paralleled by a rise in research on it. Some researchers study live streaming itself, while others use live-streaming platforms as a rich source of data (e.g., chat-data analysis). Researchers are making use of this new technology in creative ways, studying all parts of the phenomenon.

We (Raquel and Katherine, along with Jessica Hammer at Carnegie Mellon University) organized a workshop, first at the Foundations of Digital Games (FDG) conference two years ago and again this past year at the 2019 CHI conference in Glasgow, U.K., to gather people working in all corners of this exciting emerging area [5]. Live streaming shapes how we as a field think about how people interact in real time around shared experiences. Much interesting work was shared at these workshops, and exciting questions arose about how HCI researchers study and design for live-streaming interaction. In the five articles that make up this Special Topic, you will read some of the highlights of what was shared.

The topics of these articles cover a wide span, from video gaming to art, music, culture, and religion. The first article, by Pascal Lessel and Maximilian Altmeyer from DFKI, focuses on the gaming side of Twitch, discussing audience interactivity. The next article, by Ailie Fraser and Scott Klemmer from UC San Diego, and Mira Dontcheva and Joy Kim from Adobe Research, discusses how live-streaming creative endeavors has affected the streamed content itself. Lee Taber, Leya Breanna Baltaxe-Admony, and Kevin Weatherwax from UC Santa Cruz have written an article about content that is a hybrid between live and recorded: live music streams, often specifically "lofi hip hop" streams, generally with a set of songs and a video playing on a loop. The fourth article, by Zhicong Lu from the University of Toronto, discusses promoting cultural heritage practices in China through live streams. We finish off the Special Topic with an article by David Struzek, Martin Dickel, Dave Randall, and Claudia Müller from the University of Siegen, about streaming church services to remote viewers, helping to keep the elderly population of a rural community engaged in church when they can't be there physically.

We hope you enjoy reading about the fascinating work of these phenomenal researchers.

References

1. TwitchTracker; https://twitchtracker.com/statistics

2. Robinson, R., Rubin, Z., Márquez Segura, E., and Isbister, K. All the feels: Designing a tool that reveals streamers’ biometrics to spectators. Proc. of the 12th International Conference on the Foundations of Digital Games. ACM, New York, 2017, Article 36.

3. Twitch. TwitchPlaysPokemon, 2017; https://www.twitch.tv/twitchplayspokemon

4. Choice Chamber. Studio Bean, 2015; https://store.steampowered.com/app/359960/Choice_Chamber/

5. Robinson, R., Hammer, J., and Isbister, K. All the World (Wide Web)'s a stage: A workshop on live streaming. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2019, Paper W23.

Authors

Raquel Robinson is a Ph.D. student at the University of Saskatchewan studying computer science. She received her M.Sc. from UC Santa Cruz, where she developed All the Feels, a tool to incorporate biometrics into online gameplay streams. She focuses on enhancing social connections through augmenting biometric data in gameplay environments. Raquel.robinson@usask.ca

Katherine Isbister is professor of computational media at UC Santa Cruz, where she directs the Social and Emotional Technology Lab. She has authored several books, including How Games Move Us, about the emotional and social connections games provide. Current research focuses on augmenting social experience both in-person and across networks with technological interventions. Katherine.isbister@gmail.com


©2020 ACM  1072-5520/20/01  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2020 ACM, Inc.
