As I write this, I am a guest researcher at Nottingham University in the U.K., staying at a grungy teacher’s flat. In such circumstances, having access to your own music makes life a lot more agreeable, but of course I could not bring my CD collection with me. Fortunately, by buying some small speakers and hooking them up to my iPod, I can now listen to my records even though I am hundreds of miles from where the discs are physically located. For music, the physical storage medium is becoming increasingly irrelevant, being replaced by devices that access bits on a hard-drive or streamed over a network.
A combination of growing disc capacities, compression algorithms, and increasing bandwidth means we can have an almost limitless supply of music just about everywhere. By replacing corporate muzak and conservative radio schedules with portable MP3 players, online music stores, file sharing, ringtone downloads, and celebrity playlists, we herald the age of ubiquitous music. But as music becomes ubiquitous, how do we exploit the capabilities the new technology makes possible? And what should be included in the interaction design: the buttons on the physical box, the GUI, or the entire experience?
Interaction design used to be about graphical interfaces, but as technology becomes more personal, it is less and less possible to maintain the traditional divide between software development and product design. Whereas the physical interface to the workstation has become standardized over the last 20 years, the field is still wide open for computational devices off the desktop. These products will be much smaller and less expensive than PCs, allowing for much more customization in hardware design.
Apple’s iPod is a good example of this trend, as it successfully combines physical and virtual form. Whereas most manufacturers buy the hardware components or entire players from Asia and customize them with local software, the iPod is designed for consistent interaction from the bottom up. Crucially, this integration is not limited to the device itself; it entails the entire experience. The design of the iPod extends into cyberspace, incorporating the iTunes software that lets you transfer songs, buy new music, and share files with others on the same network. This seamless extension of interaction design from physical buttons, through the on-screen GUI, to the Internet and beyond is an indicator of things to come.
The iPod is an example of how the shape of the physical gadget serves to mediate a media experience that involves communication, content, and creativity. Other companies are surely waiting in the wings to introduce products that serve to deliver a similar mix (and the smart money is not so much on Microsoft as on Japanese and Korean mobile-phone manufacturers). But this is just the first step. By outfitting a portable media player with an extra sensor here, a little bit of networking there, researchers are already creating a ubiquitous music experience that makes the iPod feel as quaint as a cassette Walkman.
Profound music experiences are often collective, be it playing in a band or dancing at a party. Yet a lot of music is experienced in isolation: if we turned up with a noisy boombox on the subway, the other passengers would not be happy. New technology can break this isolation. For instance, tunA from Media Lab Europe lets you eavesdrop on what people in the vicinity are listening to. The system, which runs on Wi-Fi-enabled PDAs, is a shared-music experience where each user effectively becomes their own little radio station. Similarly, Sound Pryer from the Interactive Institute lets people listen in on the music playing in passing cars on the road. Unlike iTunes, these systems create a shared experience by allowing real-time synchronous listening rather than just providing read access to other people’s files. They point to how the physical and social environment is becoming an increasingly important resource in interaction design.
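The synchronous-listening idea behind systems like tunA and Sound Pryer can be sketched as a "now playing" beacon that a player announces to nearby peers, who can then tune in at the same playback position. The JSON message format, field names, and loopback-UDP demo below are illustrative assumptions, not either system's actual protocol.

```python
import json
import socket

def encode_beacon(user: str, track: str, position_s: float) -> bytes:
    """Serialize a 'now playing' beacon. Peers that receive it can join
    the stream at the same position (synchronous listening, as opposed
    to simply copying a file)."""
    return json.dumps({"user": user, "track": track, "pos": position_s}).encode()

def decode_beacon(data: bytes) -> dict:
    """Decode a beacon received from a nearby peer."""
    return json.loads(data.decode())

if __name__ == "__main__":
    # Demo over the loopback interface; a real system would broadcast
    # the beacon over local Wi-Fi so that strangers in range can tune in.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))
    port = rx.getsockname()[1]
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.sendto(encode_beacon("alice", "song.mp3", 42.5), ("127.0.0.1", port))
    beacon = decode_beacon(rx.recv(1024))
    print(beacon["user"], "is playing", beacon["track"])
    tx.close()
    rx.close()
```

The key design point is that the beacon carries a playback position rather than the audio file itself, which is what makes the experience shared in time rather than a transfer of data.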
Another way of expanding the experience is to use sensors to detect changes in the environment and adapt the music accordingly. At the Future Applications Lab, we developed Sonic City, a wearable context-aware music generator. A microphone takes in sounds from the city environment, which are transformed into music based on input from sensors including an accelerometer, a metal detector, and a light sensor. The timbre changes when it is a bright day or you pass under a dark overpass; the tempo depends on walking speed and turns; and the presence of metal introduces new modulations to the sounds. The wearer creates music in a duet with the city itself. Ultimately, the system becomes a musical instrument that lets you creatively exploit your environment, so that an ordinary walk to work becomes the score for a new composition every morning.
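The core of such a system is a mapping from sensor readings to synthesis parameters. The sketch below shows one minimal way to structure that mapping; the specific formulas, parameter names, and ranges are assumptions for illustration, not Sonic City's actual rules.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One snapshot of the wearable's context sensors."""
    light: float      # ambient light, 0.0 (dark overpass) .. 1.0 (bright day)
    speed_mps: float  # walking speed in meters per second
    metal: bool       # metal detector triggered

def map_to_music(frame: SensorFrame) -> dict:
    """Map a sensor frame to synthesis parameters.
    All mappings here are illustrative, not the real Sonic City design."""
    # Brighter surroundings -> brighter timbre (higher filter cutoff)
    cutoff_hz = 200.0 + frame.light * 7800.0          # 200 Hz .. 8 kHz
    # Faster walking -> faster tempo, clamped to a musical range
    tempo_bpm = max(60.0, min(180.0, 60.0 + frame.speed_mps * 40.0))
    # Nearby metal introduces a modulation effect
    mod_depth = 0.5 if frame.metal else 0.0
    return {"cutoff_hz": cutoff_hz, "tempo_bpm": tempo_bpm, "mod_depth": mod_depth}
```

For example, a brisk walk on a sunny day (`SensorFrame(light=1.0, speed_mps=1.5, metal=False)`) would yield a bright timbre at a moderate tempo, while ducking under a dark overpass would pull the filter cutoff down.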
The field of ubiquitous music is growing rapidly and engaging many stakeholders, from device manufacturers to sociologists, from network operators to musicians. There have already been two successful workshops on Mobile Music Technology, which attracted a mix of industry and academia. The SIGGRAPH conference, traditionally dominated by 3D graphics and Hollywood special effects, is organizing a panel on ubiquitous music. But the biggest revolution may be happening in a pocket near you. As mobile phones and other devices reach the capacity necessary to allow people to buy, share, and communicate musical experiences, music will be more ubiquitous than ever before. And as music is increasingly accessed through computers of various shapes and sizes, will album-cover art be replaced by interaction design?
Sonic City: http://www.viktoria.se/fal/projects/soniccity/
Sound Pryer: http://www.tii.se/mobility/soundpryerpresentation.htm
International Workshop on Mobile Music Technology: http://www.viktoria.se/fal/events/mobilemusic/
SIGGRAPH 2005 panel: Ubiquitous Music: How Are Sharing, Copyright, and Really Cool Technology Changing the Roles of the Artist and the Audience?: http://www.siggraph.org/s2005/main.php?f=conference&p=panels&s=music
Lars Erik Holmquist
About the Author:
Lars Erik Holmquist is leader of the Future Applications Lab at the Viktoria Institute in Göteborg, Sweden. Before this, he founded and led the PLAY research group from 1997 to 2001. He is interested in innovative interactive technology, including tangible interfaces, informative art, mobile media, and autonomous systems. He was general chair of UbiComp 2002, the international conference on ubiquitous computing, and is an associate editor of the journal Personal and Ubiquitous Computing.
©2005 ACM 1072-5220/05/0700 $5.00