People: on the edge

XII.3 May + June 2005
Page: 58

The robots are coming


Authors:
Lars Holmquist

"Robots." For most of us, the word still conjures up images of clunky tin men and scary artificial intelligences, or—a little more favorably—the cute but basically useless robot dogs and automatic vacuum cleaners that are slowly starting to populate our homes. But as the last issue of <interactions> shows, robots are coming, and interaction designers will have to deal with them sooner rather than later. Robots will not just be amusing toys or specialized factory installations, but will appear in situations where we would never even imagine a robot today.

In fact, we will most probably not even think of them as robots. Consider that one of the most influential recent developments in human-computer interaction has been tangible interfaces—that is, computer input and output that is not based on the standard screen, mouse, and keyboard setup but instead involves a variety of physical and tangible input and output devices. If these interfaces are to support truly tangible output, they must be able to move or otherwise affect the world around them. And what else could we call a physically actuated, computer-controlled entity but a robot? Some recent examples of tangible interfaces are already, for all intents and purposes, robots. For instance, the MIT Media Lab’s Topobo system is a kind of physical construction kit that lets users build complex animal-like shapes. The models can then be taught simple behaviors so that they can perform different movements and even walk.

My own research group, the Future Applications Lab, is currently engaged in a European research collaboration on robots, Embodied Communicating Agents, or ECAgents for short. Most of this project is about interaction design at the most fundamental level; the researchers hope to construct autonomous devices that communicate with each other by building their own language from the bottom up, without human interference. Sound crazy? You haven’t heard the half of it. In Paris, the Sony Computer Science Lab is conducting a "playground experiment" in which robot dogs in a playpen communicate with each other through noises and gestures. The hope is that the robots will eventually evolve their own language, much like children in a real-world playground. At the Institute of Cognitive Sciences and Technologies in Rome, simulated robots solve complicated tasks in their own way, by evolving visual and auditory signaling systems that have no human precedent. The researchers let the robots invent practical solutions to a problem and then observe the resulting behavior, much as a biologist tries to understand communication between animals. And at the Swiss Federal Institute of Technology in Lausanne, hordes of "S-Bots" spontaneously hook up to each other to form larger "Swarm-Bots" that can solve problems too grand for a single robot. These bots are still a little too big and unreliable to do anything particularly useful, but soon we might see them swarm over inaccessible areas to perform rescue operations or other actions that are impossible for humans.

Basic research in robot communication is quite different from designing robotic products for real users. Fortunately, some interaction designers are already taking an interest in robots. For instance, in the People and Robots project at Carnegie Mellon University, interaction designers and robot researchers are exploring "robotic products," which they hope will be "intelligent, social, and able to assist us in our day-to-day needs." Carl diSalvo, a CMU Ph.D. student, has done some interesting work in which ethnography and interaction design concepts were applied to robots. Taking existing robot research as a starting point, he studied a robot group at CMU as they constructed and deployed a guide robot for a conference. It turned out that the robot was not particularly good at following social protocols that are second nature to the rest of us, and that some of the robot’s behaviors that observers found most interesting and "human" were in fact errors—such as cutting the queue in front of a professor from a rival research lab! To find opportunities for new products, diSalvo also studied how technology currently fits into the everyday home. By letting people comment on the technology they currently use and how a robot would fit into everyday life, the researchers uncovered interesting opportunities. Through cultural probes, people were encouraged to photograph "things that are like robots," which turned out to include computers and vacuum cleaners; "things that are not like robots" included a chair and a washing machine. diSalvo suggests that there are opportunities for robots that support social and emotional communication and awareness, or that improve care of the self. An example produced at CMU is "The Hug," a conceptual robot product that can receive and give hugs over a distance, so that two people can communicate emotionally and intimately while being apart.

At the Viktoria Institute we recently organized the workshop Designing Robot Applications for Everyday Use. The approximately 15 participants were an interesting mix of robot researchers and interaction designers from both industry and academia. The event served to spotlight the imminent convergence of robotics and other areas, including interaction design. For instance, Shaun Lawson from Napier University presented a project in which human-dog interaction, including the uncanny ability of dogs to predict epileptic seizures, is studied in the hope of creating future assistance robots—perhaps embodied in the shape of a robot dog. Rutger Menges from Eindhoven University of Technology talked about an experiment in "cruelty towards robots," which explored whether Reeves and Nass’ media equation holds for robots when one side of a classic psychology experiment is replaced with a robot (it didn’t quite hold in this case—participants were prepared to be cruel to a Lego robot to the point of breaking it!). And Helge Hüttenrauch of the Royal Institute of Technology showed how hard it is to bring the human-computer interaction perspective into traditional robot research.

Another goal of the workshop was to explore possibilities for designing new and innovative robot applications. In a brainstorming session, participants first generated the basic components of an application concept: robot types, robot properties, environments and tasks, and users. These were then jumbled randomly, one from each category, to create seeds for application brainstorming. The randomly generated combinations were then turned into applications: for instance, the unlikely seed "a submarine, waterproof robot that helps military personnel in a wardrobe" became a support robot that manages diving equipment, and an "aerial-shaped robot with the ability to detect vanity and aid jealous husbands and wives while dating" became a robot spy. Participants then selected their favorite application concept to flesh out, building a rough model to show how it would work. The final concepts included a hovering robot assistant to manage fear and anxiety at amusement parks; a traveling robot buddy that would inform and amuse during car trips; and a system of robotic plants in a public space, such as an airport, that would autonomously move to positions to better accommodate the flow of people and alleviate queues. While the final designs were perhaps not always realistic as products, the brainstorming was successful in that it allowed people from different backgrounds to work together in defining and expanding the boundaries of robotic products for everyday users.
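The jumbling step of the brainstorming session can be sketched in a few lines of code. The category lists below are illustrative stand-ins, not the participants' actual lists, and the phrasing template is an assumption; only the four categories and the random one-from-each combination come from the workshop description.

```python
import random

# Hypothetical seed lists for the workshop's four categories
# (robot types, robot properties, environments and tasks, users).
# The entries are invented for illustration.
ROBOT_TYPES = ["submarine", "aerial-shaped", "hovering", "plant-like"]
PROPERTIES = ["waterproof", "able to detect vanity", "self-assembling"]
TASKS = ["manage diving equipment", "alleviate queues", "amuse during car trips"]
USERS = ["military personnel", "airport travelers", "families on vacation"]

def brainstorm_seed(rng: random.Random) -> str:
    """Jumble one entry from each category into an application seed."""
    return (f"a {rng.choice(ROBOT_TYPES)}, {rng.choice(PROPERTIES)} robot "
            f"that helps {rng.choice(USERS)} to {rng.choice(TASKS)}")

if __name__ == "__main__":
    rng = random.Random(2005)  # fixed seed so a session can be replayed
    for _ in range(3):
        print(brainstorm_seed(rng))
```

The deliberate randomness is the point of the exercise: incongruous combinations force participants past the obvious applications they would have proposed anyway.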

Meetings between robots and interaction designers will become even more frequent in the future. While Web pages and GUIs will always be an important part of the profession, designers who are aware of the potential of physically actuated products—call them robots or something else—will have a clear advantage over those who stay with purely visual modes of interaction. The skills of interaction designers will be vital to creating robotic products that are not just shaped by technological possibilities but also take into account users’ needs and abilities. Who knows—perhaps tomorrow’s computer mouse will not be content to lie flat on the desk, but will scuttle around of its own accord, chased by frustrated users!

URLs

MIT Media Lab, "Topobo: A Constructive Assembly System with Kinetic Memory" http://tangible.media.mit.edu/projects/topobo
European Community research project, "ECAgents: Embodied and Communicating Agents" http://ecagents.istc.cnr.it
Carnegie Mellon University, "Project on people and robots" www.peopleandrobots.org
Viktoria Institute workshop, "Designing robot applications for everyday use" www.viktoria.se/fal/events/robotworkshop

Author

Lars Erik Holmquist
leh@viktoria.se

About the Author:

Lars Erik Holmquist is leader of the Future Applications Lab at the Viktoria Institute in Göteborg, Sweden. Before this, he founded and led the PLAY research group from 1997 to 2001. He is interested in innovative interactive technology, including tangible interfaces, informative art, mobile media, and autonomous systems. He was general chair of UbiComp 2002, the international conference on ubiquitous computing, and is an associate editor of the journal Personal and Ubiquitous Computing.

©2005 ACM  1072-5220/05/0500  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2005 ACM, Inc.
