Mark Baskinger, Mark Gross
Interaction design melds traditional methods and approaches from other established disciplines. Many immediately think of digital technology or software, but the concepts of “interaction” are deeply rooted in classical industrial design: products are designed to actively engage people and mediate their relationships with systems, activities, information, and with each other. Today interaction design includes services, systems, and strategic planning, and reflects core principles of human-system and human-object interaction.
Within the diverse landscape of interaction exists a specialized area where physical form and computing combine to yield new paradigms of interaction. This area, “tangible” interaction design, broadens the field’s scope, relevance, and application, linking interaction designers more directly with product development. It provides a new arena for industrial designers, transdisciplinary thinkers, and experimenters to develop artifacts that behave, react, and/or interact with people. More than computation or embedded intelligence alone, tangible interaction design focuses on the human behavior and experience that guide the design of form and establish the roles an artifact will play. Tangible interaction designers use their work to question and reflect on the integration of technology and its effects on human experience.
The First Interaction Designers?
The Lower Paleolithic (Acheulian) hand ax (see Figure 1) discovered in northern Africa was designed with obvious intent and made specifically for a left-handed individual. It tells the story of how our hands inform our brains during interaction. Primitive humans (Homo ergaster) from eons ago were among the first to consider ergonomics and fit: they were early pioneers of good design, and yes, interaction. It was great product design and great interaction design all rolled into the cutting-edge technology of its time, and it was intended to endure. This left-handed hand ax is an important example of how the physical form of our hands guides how we shape things and how we interact with and experience the world.
What products today will endure? Will the iPhone have the same impact as the hand ax? Is it tangible enough? Why do avid texters prefer phones with physical keyboards? Does a vibrator motor provide enough feedback to make a virtual keyboard seem “real”? Asking questions like these is important in designing tangible products. Such questions challenge us to reflect upon how artifacts facilitate interaction and help us connect with people on deeper levels.
Convergence, Synthesis, and New Paradigms
The hand ax reminds us that design has always been about interaction, and interaction has always been tangible. What’s new is that physical interaction is becoming computationally mediated, or conversely, that computational media are becoming physically embodied. Designers of physical things and places must consider how to embed software; designers who work in software alone must leverage the constraints and affordances of the physical world. New paradigms are arising through the investigation of new embodiments of technology that achieve seamless integration of form and interaction. Integrated form and computation that enhances our experiences with systems, objects, and places will resonate on deeper visceral levels, tapping into emotions and sparking new relationships. This integration leverages form to influence human behavior in richer ways. Adaptive, responsive, thinking, yet physical, objects induce a dialogue through gestures and physical touch. They implore us to relate to them as if they were more “alive” than toasters and toothbrushes. Looking through this lens reveals a renewed outlook on traditional industrial design that revisits the aspirations of early pioneers. It positions designers as catalytic agents for broader impact rather than mere stylists for commodities. And it shifts the design discourse again, as computation enables designers to make objects do things they simply couldn’t do before. A few years ago, interaction design meant solely screen-based digital interactivity. Today interaction designers are called on to be cross-platform, multidisciplinary problem solvers.
Tangible interaction is the physical embodiment of computation. Tangible interaction practitioners, researchers, and educators integrate knowledge from many areas. They draw upon traditional design, engineering, computing, and robotics in a mashup of skills and methods: thinking and making in physical form, electronics, and code. As the field develops, those who are adept at working simultaneously in code and in physical form will synthesize new processes for developing products. They will see and work across the old disciplinary boundaries.
Form is an important element in tangible interaction, as it visually signals and physically embodies functionality, expresses cues for understanding, and provides the script for interaction. Carnegie Mellon’s School of Design focuses on physical embodiment through projects about people’s relationships with objects and systems. Students explore the role and impact of object forms with the intention of developing artifacts that are useful, usable, desirable, and that exhibit permanence. We encourage our students to look past cosmetic aspects of form to consider objects as communicators that will elicit a continuing dialogue with people.
In one exercise, students design a set of simple hand tools that facilitate and express a function. Students structure semantic cues and form language to visually inform the mind, physically engage the body, and guide the hand for interaction (see Figure 2). Drawing inspiration from Neolithic hand tools, they consider body position, movement, ceremony, and utility to guide form development. This exercise is a primer for interaction and subsequent study in the industrial design curriculum that emphasizes the role and impact of form in shaping people’s behavior and experience.
Students are given design problems that require them to use physical form to mediate and facilitate interaction. They develop sensitivity to form through experiments with materials and purposeful play. The specimens shown in Figure 3 engage students in form development, construction, and physical manipulation. Throughout the curriculum, students engage with mechanical, embedded, intelligent, and/or adaptive systems that encourage new forms of interaction (see Figure 4). They learn to use form language, aesthetics, ergonomics, and the traditional methods of industrial design in making interactive products understandable and appropriate.
Form connects with computing through sensors and effectors. Sensors provide input. The simplest and cheapest sensor is a switch; today, buttons dominate our interaction with electro-mechanical products. Well-positioned switches can sense how an object is being held. Sensors abound: for temperature, movement, pressure, force, moisture, chemicals, stretch and strain, and so on. Effectors provide output. Long popular as indicators, LEDs mounted beneath a translucent skin can change an object’s color. And there’s audio: Everything beeps and buzzes, but what do these sounds tell us? We could do so much more with sound design. Motors, too, are effectors, providing motion and other physical action. For example, vibrator motors in cell phones bring a physical quality to digital interaction. Touch and light make a simple and compelling combination: The Hit Me interactive light (see Figure 5) lights up in different patterns, depending on how it is touched or grasped. Less common are effectors such as the nitinol muscles in Greg Saul’s paper robots or the thermochromic paint that colors his lamps (see Figure 6).
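The sensor-to-effector mapping described here can be sketched in a few lines of code. The sketch below is a hypothetical illustration in the spirit of the Hit Me light, not its actual firmware; the switch positions and pattern names are invented for the example.

```python
# A sketch of mapping sensed touch to LED output, in the spirit of the
# Hit Me light: which switches are closed (how the object is grasped)
# determines which light pattern plays. All names here are hypothetical.

def led_pattern(closed_switches):
    """Map the set of closed switches to an LED pattern name."""
    if {"top", "side"} <= closed_switches:
        return "pulse"    # grasped firmly around the body: slow pulse
    if "top" in closed_switches:
        return "ripple"   # tapped on top: ripple outward from the tap
    if closed_switches:
        return "glow"     # any other touch: steady glow
    return "off"          # untouched: dark

print(led_pattern({"top", "side"}))  # pulse
print(led_pattern(set()))            # off
```

Even this trivial mapping shows the design decision at stake: the table of touches-to-patterns is the interaction, and the designer, not just the engineer, now authors it.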
The new element in design is software, a fundamentally abstract and disembodied way to prescribe behavior. In the past, industrial designers set the stage for interaction and behavior by making decisions about physical form and materials. Now the designer also programs the object’s interactive behavior. The simplest program relates inputs directly to outputs (“When the door is open, the lights blink”). Usually, though, software is a more subtle model of the design in use. For example, a state-machine model looks at different states of the design in use and transitions between states. In one state, a clock displays the time; in another state, it can be set. A sensor on the clock (typically, but not necessarily, a button) triggers transitions between the two states.
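The clock example can be made concrete with a minimal state machine. This is a sketch under stated assumptions: the two state names, the button, and the knob are invented for illustration, not taken from any particular product.

```python
# Minimal sketch of the state-machine model described above: a clock
# with two states, DISPLAY and SET. The button triggers transitions;
# the same knob input means different things in different states.

class Clock:
    def __init__(self):
        self.state = "DISPLAY"  # start by showing the time
        self.hour = 12

    def press_button(self):
        # The sensor (here, a button) toggles between the two states.
        self.state = "SET" if self.state == "DISPLAY" else "DISPLAY"

    def turn_knob(self):
        # In SET the knob advances the hour; in DISPLAY it does nothing.
        if self.state == "SET":
            self.hour = self.hour % 12 + 1

clock = Clock()
clock.turn_knob()      # ignored: the clock is displaying the time
clock.press_button()   # transition to SET
clock.turn_knob()      # now the hour advances: 12 -> 1
print(clock.state, clock.hour)  # SET 1
```

The point of the model is that behavior depends on state: identical physical input produces different results, and the designer must make the current state legible in the object’s form or display.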
Fueled by the enthusiasm of artists, hobbyists, and DIY hackers, and by the interaction-design community, new hardware tool kits and platforms make it easier to build and program working prototypes of products with embedded electronics. An early well-known kit was Phidgets; popular today in the design community is the Arduino family of microcontroller boards, including the LilyPad, engineered for embedding in textiles. Hardware design environments such as Fritzing and a host of programming environments such as Pd and Funnel invite designers to work directly with electronics and code. Tool kits and platforms are crucial, and as tangible interaction design moves forward, we will see more “designerly” languages and tools for hardware prototyping and programming.
Soon tangible interaction design will come to encompass not only individual products with embedded computation, but also ensembles of computational objects that work together. Take the robot construction kit in Figure 7. Microprocessors in each block communicate with neighbors, giving the robot ensemble emergent behaviors: Form and program are one. As computation becomes more deeply embedded in form, programming ensembles will become as important as programming the behavior of single objects.
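The idea that “form and program are one” can be illustrated with a toy simulation. The sketch below is hypothetical and greatly simplified relative to the actual robot construction kit: the block roles and the message-passing scheme are invented for the example.

```python
# A toy simulation of a computational ensemble in the spirit of
# Figure 7: each block runs the same small program and talks only to
# its neighbor, so behavior emerges from how blocks are snapped
# together. Roles and values here are invented for illustration.

class Block:
    def __init__(self, role):
        self.role = role   # "sensor", "data", or "effector"
        self.value = 0

    def step(self, upstream):
        # Each role transforms its neighbor's value differently.
        if self.role == "sensor":
            self.value = 10             # pretend to read a light level
        elif self.role == "data":
            self.value = upstream * 2   # e.g. amplify the signal
        elif self.role == "effector":
            self.value = upstream       # drive a motor at this speed

def run(chain):
    # Pass each block's value down the chain of neighbors.
    prev = 0
    for block in chain:
        block.step(prev)
        prev = block.value
    return chain[-1].value

# Two configurations of the same parts yield different behaviors:
# the arrangement of blocks IS the program.
print(run([Block("sensor"), Block("data"), Block("effector")]))  # 20
print(run([Block("sensor"), Block("effector")]))                 # 10
```

No block knows the overall goal; rearranging the physical configuration rewrites the ensemble’s behavior, which is exactly what makes programming ensembles a design problem and not only an engineering one.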
Where Do Tangible Interaction Designers Fit?
Many first-generation tangibles have been whimsical and artistic explorations of what new technology can do. Some are simple; some, more complex. Some are elegant embeddings of display and projection. Some celebrate new materials. Some add sensing in clever ways. The field is still wide open, but one thing is clear: We’re likely to see more, not less, programming in things, and a lot more experimentation.
Faced with the integration of form and computation, researchers and practitioners are asking new questions about the aesthetic qualities of interaction and the impact of form on human behavior. As tangible interaction design matures, designers will focus more on the meaning and impact of form on people. This, of course, echoes traditional human-centered design approaches, but computation provides the opportunity to design adaptive, responsive, and highly interactive products and systems.
Tangible interaction designers are the new “Leonardo/Edison” types. As makers they are equally at home in the worlds of material and physical design, mechanical engineering, electronics, and programming. Already they are in demand, precisely because they transgress the traditional disciplinary boundaries that characterize (and limit) our schools and firms today. Their diverse range of ability is what enables them to be creative in this new design space. As firms embrace the new integration of form and computation, tangible interaction designers will play a more prominent role in product development, bridging the gulf between traditional design and programming.
The growing tangible interaction community meets at several conferences that explore design, technology, and societal impact. DesForM (Design and Semantics of Form and Movement) gathers academics and professionals in a forum that embraces the diversity of design approaches. It focuses on the meaning of products and how designers communicate information, functions, and ideas so they can be perceived and understood by people in their everyday lives. TEI (Tangible, Embedded, and Embodied Interaction) is a demo-friendly conference about human-computer interaction, design, interactive art, user experience, tools, and technologies. And of course there’s CHI, which in 2009 showcased tangible interaction design with a new “Design Vignettes” venue. Other pertinent conferences include DIS (Designing Interactive Systems), UIST (User Interface Software and Technology), IDSA (Industrial Designers Society of America), and DUX (Designing for User eXperience).
The first schools to embrace tangible interaction design in the 1990s included the Royal College of Art, the MIT Media Lab, and NYU’s Tisch School of the Arts. Programs have since sprung up around the world. Perhaps because of the inherently interdisciplinary nature of tangible interaction design, many universities teach this form of design in a distributed manner across schools, departments, and programs. For example, Carnegie Mellon’s new master’s program in tangible interaction design leverages the university’s strengths in design, robotics, engineering, human-computer interaction, architecture, and the arts.
Toward a Tangible Future
We have painted a rosy view of tangible interaction design, emphasizing two different and equally important aspects of the field. On one hand, tangible interaction designers are experimenters, playing purposefully in a new space of form and computing. The vocabulary of form, function, and behavior of computationally enhanced products is still very much under construction; this yields some work that is an engineering triumph yet awkwardly made, and other work that is elegant and clever but without apparent function. On the other hand, tangible interaction designers aim to make things that elegantly integrate form, computation, and behavior. Bringing these two together is the challenge for individual designers, and also the challenge for the field. The spirit of experimentation with new materials and processes is what makes tangible interaction design daring. Yet it is the seamless integration of form and computation that makes it magical. We must keep the thrill of experiment while evolving design vocabularies and processes that integrate code, product form, behavior, information, and interaction. In this way, we will enable a new genre of products to achieve relevance and connect with people in meaningful ways.
Mark Baskinger is an associate professor in the School of Design at Carnegie Mellon University, where he teaches courses in industrial design with an emphasis on form and interaction. His interests include exploring new paradigms for interactive objects and interpretive environments, and methodologies of design drawing and visual thinking to promote collaboration. An international speaker and workshop leader, Baskinger also conducts “Drawing Ideas®: A Field Guide to Visual Thinking” courses in conference and business contexts where he makes design drawing methods and visual thinking techniques accessible to a broader audience and demonstrates strategies for using sketching to foster collaboration in design processes. Parallel to his appointment at Carnegie Mellon, he co-directs The Letter Thirteen Design Agency (www.letterthirteen.com).
Mark D. Gross is a professor at Carnegie Mellon University’s School of Architecture where he studies and teaches tangible interaction design. His MIT Ph.D. dissertation, “Design as Exploring Constraints,” is in design theory and methods. Gross is interested in computational tools to support designing, and ways in which computing can be embedded in things and places. He recently co-founded Modular Robotics (http://www.modrobotics.com/) to build computational construction kits for children.
Figure 1. This left-handed Lower Paleolithic (Acheulian) bifacial hand ax, found at an exposed site in northern Africa, dates back 1.2 million to 500,000 years. The purposeful contouring of its surfaces and the placement of indentations for the hand demonstrate design meant to afford grip and guide interaction.
Figure 3. Top, forms in various materials invite touch and manipulation (Mark Baskinger). The wooden forms were found in a lakeside market in Taiwan. Bottom, “interactables” that encourage form development, construction, and physical manipulation (Mark Baskinger and Jason May).
Figure 4. “Re-routed Radio” projects by industrial design students (a) Nadeem Haidary, (b) Josh Finkle, and (c) Gavin Stewart. These music players were designed to establish new forms of interaction using standard electronics combined with non-traditional materials and expressive physical forms.
Figure 5. The Hit Me interactive lighting device responds to touch with an LED display. It affords various interactions with the hand. Designed by Carnegie Mellon students Henry Julier, Justin Rheinfrank, Amanda Ip, and Michael Cruz-Restrepo; directed by Kees Overbeeke (TU/e). Image source:
Figure 6. Interactive designs with unusual effectors by Greg Saul (Carnegie Mellon and Victoria University of Wellington). Left: a family of paper robots, folded paper boxes actuated with nitinol shape-memory alloy muscles. The simple devices can be programmed to respond to light, sound, or online chat; their motions are alternately graceful, silly, and playful. Right: a paper lamp colored with thermochromic ink changes appearance as its bulb heats its shell.
Figure 7. Eric Schweikardt’s toy blocks snap together to construct working robots. Black blocks are sensors; white blocks, effectors; and colored blocks operate on data. The configuration of blocks determines the robot’s behavior. Modular Robotics LLC is bringing the kit, designed at Carnegie Mellon, to market.
©2010 ACM 1072-5220/10/0100 $10.00