Gadgets '06

XIII.4 July + August 2006
Page: 24

The music is the message


Authors:
Matt Jones, Steve Jones


Lost in Music? Picture this. A young woman, head back, swaying gracefully, lost in the music being played by her tiny mobile audio appliance. She's standing waiting for a train. Other commuters swarm nearby, headphones on, listening in more muted ways to the soundtracks they've chosen to accompany today's episode of their daily routine.

Mobile music players—iPods, Walkmans and the like—are exciting for those of us involved in thinking about the future of mobile devices and services. Like the mobile phone—increasing numbers of which also sport a music player—these are gadgets that people actually desire, and aspire to own. They have vast storage capacities, increasingly sophisticated processors and pleasing, effective mechanisms for users to control their functionality. From click and jog wheels to high-resolution color displays, these players are far from being the impoverished devices that typified mobile platforms just a few years ago. And there are hundreds of millions of them in circulation—Apple sold five million in the first quarter of 2006 alone.

But at the same time, these marvelous mobiles could be seen as an obstacle to the bright futures dreamt up in the research labs. How will people be able to make use of the ubiquitous computing infrastructure being envisaged, the anytime, everywhere access to information and services, when they are immersed in digital sound bubbles of their own choosing?

Some mobile listeners, at first sight, certainly appear cut off. Another scene: A young man stares out of a train window, watching the long green fields flash by. His eyes are bright but still; his gaze betrays a mood of deep reflection. Listening to music on expensive, noise-reducing headphones, what is he thinking? Is he sadly contemplating a lover left or anticipating a longed-for meeting at his destination? How can we design mobile services that will fit this context? The conventional clumsy beeps, vibrations, and attention-seeking displays seem out of place. What opportunities are there to build more empathetic systems, systems to enhance the music and environmental experience or to provide ways for the listener to later reflect on the situations they've lived through in relation to the music they've heard?

At other times, though, these sound bubbles seem more permeable, don't they? One of the photos of people listening to music available on a popular image-sharing site is entitled "Listening to Kim's iPod and talking to Joe!" A teenager holds a mobile phone to her left ear while wearing the distinctive white music-player headphones. Chatting and listening, the cocktail-party effect played out in a 21st-century context. And, what about those groups of trend-conscious youths, swaggering with try-hard confidence down our city streets? Look closely; they are joshing loudly with each other, but each is also half-listening to a music player, one headphone in, the other dangling. Older users (like us) also seem adept at combining the physical and digital worlds—listening while waiting for a train or plane, we can quickly remove and replace the headphones to monitor announcements. Are there ways of making use of these practices of mixed listening, allowing mobile services to permeate the bubbles of sound?

Instead of viewing the music player as a competitor for our more sophisticated, highly featured mobile services, perhaps a good starting point is to think how we might use existing music products and practices. In the rest of this article we give some examples of prototypes we've been building that look at the role of music as a mediator for contextual information—that is, systems that continuously adapt the listener's own audio choices to provide them with a richer understanding of the environments they pass through.

Finding a Way. There's been much interest in developing mobile and ubiquitous services that are context-aware [3]: from tourist guides that use the person's location to customize their content to match-making gadgets that alert the carrier when a potential new friend is near. Despite the enthusiasm of researchers, there is still much work to be done to develop systems that can make sense of context and communicate it effectively to users. As Dourish notes, "Context is a slippery notion" [4]; just coming up with a workable definition of it is hard, let alone building sensors, models, and interaction techniques.

Location—the "where" of interaction—is one important aspect of context (although as Abowd and Mynatt point out, there are interesting and challenging possibilities with respect to other elements relating to the "what," "why," and "when" of system use [1]). One location-based task that has received attention is pedestrian navigation. A range of mobile supports have been proposed, including speech-based audio cues and small-screen map representations for handheld use. What about, though, altering a listener's music to help them reach their destination? Here's a sample scenario:

Ben is going to meet a friend in a restaurant across town. After alighting at the tram stop closest to the restaurant, he puts on his headphones and turns on his portable music player to hear a new song he's just downloaded. At the same time, he turns on the navigate-by-music feature; the restaurant location was copied from the email his friend sent him.

He's fairly certain of the direction in which to head and, as he moves off, the music playback is clear and strong, at a normal volume and through both headphones. As he approaches a crossroads, Ben perceives the music shifting slightly to his left, with the volume decreasing. He crosses the road and heads in that direction; the music is now balanced and the volume returns to normal. As he walks farther along the street, he notices the music beginning to fade, the volume decreasing; he looks around and notices he's just passed his destination.

We've built a prototype system to explore such navigation-by-music [6]. Figure 1 shows it in use; built on relatively cheap, standard hardware—an iPaq handheld computer that is connected to a unit that provides location information (via GPS) and directional data (via an electronic compass)—it continuously adapts the music to point the listener in the right direction.

Routes are plotted as a series of audio beacons. Now imagine yourself directly facing one of these, walking toward it: You'll hear the music played back through both headphones. Move off path, though, and the music is panned in the opposite direction. Initial trials of the prototype are encouraging—despite the comparatively low fidelity of our approach, people can navigate fairly complicated routes, as shown in Figure 2.
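
The heading-to-pan mapping can be sketched in a few lines. This is an illustrative reconstruction, not the prototype's actual code: the function names, the 90-degree saturation point, and the linear pan law are our assumptions.

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def pan_for(heading, beacon_bearing):
    """Map the angle between the walker's compass heading and the bearing to the
    next audio beacon onto a stereo pan in [-1.0 (full left), 1.0 (full right)].
    Facing the beacon (angle ~0) yields balanced playback (pan ~0)."""
    diff = (beacon_bearing - heading + 180) % 360 - 180  # signed angle, -180..180
    return max(-1.0, min(1.0, diff / 90.0))  # saturate beyond 90 degrees off-course
```

For example, a walker heading north with the beacon due east would hear the music shifted fully to the right headphone, nudging them to turn.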

Mobile designers can be easily seduced into thinking that their gadget or service will be the most important tool a person can carry. However, it is always worth remembering that people are "ecological" rather than "technological"—they are adept at using a variety of resources, digital and physical, to achieve goals, and we have to design with this in mind. In our navigation studies we saw the way people combined the audio cues with visual ones, pathways and dead ends, for instance, to help them decide which way to turn. There were also cases in which a false visual landmark led people astray (see Figure 3).

Two related implementations have provided additional evidence that music can move people: Richard Etter developed the Melodious Walkabout system [5] and Steve Strachan and colleagues, gpsTunes [8].

In developing the concept we started with a lab-based simulation to compare music navigation with more conventional approaches such as the use of maps [9] and spoken cues. We then built the field prototype as mentioned above. However, there is still much work to be done in understanding the roles the approach might take.

An obvious concern is the degree to which such adaptations affect the enjoyment of listening to music; Etter's evaluation found that study participants were more enthusiastic about the approach when they used their own choice of music, but what will the long-term effects be? Then there's the question of the sort of navigation to which the approach is best suited. Sometimes people urgently need to get to a destination in a completely unfamiliar area; perhaps such an ambient approach is less appropriate there. In other cases, a person might know roughly where they need to head and want a form of reassurance, to be nudged back on track if they start to get lost.

Of course, navigation from one fixed point to another is just one possibility. What about using music to help people rendezvous with each other? Or, think about the mobile action games that have been proposed, with players running around city streets [2]—the gaming soundtrack could be adapted to give cues as to the location of and danger posed by competitors.

Finding Out. Panning the music is not only helpful for navigation. We've also been experimenting with a simple set of music cues aimed at making a mobile listener aware of something of interest in their environment. So we have a "look around" cue that pans the music quickly several times between the left and right headphones; this prompts the user to glance in both directions and notice a point of interest such as a striking building, café, or visitor attraction. Panning to just one ear prompts them to look to either their left or right.
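
One way to think about such a cue vocabulary is as short sequences of pan positions fed to the player's stereo balance. The sketch below is hypothetical; the cycle counts, step sizes, and function names are illustrative rather than taken from the prototype.

```python
import math

def look_around_cue(cycles=3, steps_per_cycle=8):
    """'Look around': sweep the pan rapidly left-right several times.
    Pan values range from -1.0 (left headphone) to 1.0 (right headphone)."""
    return [math.sin(2 * math.pi * i / steps_per_cycle)
            for i in range(cycles * steps_per_cycle)]

def look_side_cue(side, steps=8):
    """'Look left' / 'look right': hold the music briefly in one ear."""
    if side not in ("left", "right"):
        raise ValueError("side must be 'left' or 'right'")
    return [-1.0 if side == "left" else 1.0] * steps
```

A player loop would consume one pan value per audio buffer, then return the balance to center once the cue sequence is exhausted.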

This system also has a mode to help us learn what users might want to be notified about. Panning cues are generated at random points during a route, and when a participant responds to one, we can ask them what they noticed about the location highlighted by the shifting audio.

Music is an exciting experimental mediator for contextual information because of the richness of the vocabulary of possible signal adaptations. Spatial manipulations are a start, but the quality of playback can also be modified—walking toward a rough neighborhood? Perhaps the playback becomes tinnier. About to enter a crowded street? The bass is emphasized or the playback has added noise.
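
As a toy illustration of such quality manipulations, a sample stream could be run through a crude spectral tilt. The filter choices and names below are our own assumptions, not part of any deployed system.

```python
def tinnier(samples):
    """Crude 'tinny' effect: a first-difference high-pass that strips
    low-frequency energy, leaving a thin, trebly signal."""
    return [samples[0]] + [samples[i] - samples[i - 1]
                           for i in range(1, len(samples))]

def bass_heavy(samples, alpha=0.5):
    """Crude bass emphasis: mix a one-pole low-pass (smoothed) copy
    of the signal back onto the original."""
    out, acc = [], 0.0
    for s in samples:
        acc = alpha * acc + (1 - alpha) * s  # running low-pass state
        out.append(s + acc)
    return out
```

A real implementation would of course use proper audio filters, but even this sketch shows how little machinery a recognizable "quality" cue requires.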

Conclusions. People like to listen to music while on the move. Since the launch of the first tape-based Walkman, Sony has sold more than 150 million personal music players. Millions of songs are purchased and downloaded from digital music stores daily, providing a musical background to everyday activities.

Mobile music devices are powerful, pervasive, personal technologies. Increasingly, manufacturers will add features—such as wireless music purchase—to differentiate themselves from competitors. We're proposing, though, a more offbeat way to make use of emerging technologies. Music is a potent medium, able to evoke strong emotions and trigger memories. Maybe it has a role as a mediator of contextual information relating to the physical spaces we move through.

Sit in front of your desktop computer. Point and click. Drag and drop. Or take a walk with a mobile device. Again, you'll experience the discrete, event-based style of interaction: The system wants to alert you, so it buzzes, beeps, or vibrates, and you respond. Perhaps our technologies sample our lives, and let us sample life, at too low a rate. Meanwhile, as we listen to music, we go with the flow. Perhaps thinking about using signals like audio will open up new, richer, continuous ways for us to interact. Play on.

Acknowledgements. The enhanced music prototypes are being developed by Stephen Pike and Dave Arter at Swansea University, inspired by earlier work by Tim Barnett at Waikato University. Several colleagues in New Zealand have helped substantially with aspects of the navigation work. Thanks to David Bainbridge, Tim Barnett, Gareth Bradley, Geoff Holmes, and Nigel Warren.

References

1. Abowd, G. D. and Mynatt, E. D. (2000). Charting past, present, and future research in ubiquitous computing. ACM Transactions on Computer-Human Interaction 7(1): 29-58.

2. Crabtree, A., Benford, S., Rodden, T., Greenhalgh, C., Flintham, M., Anastasi, R., Drozd, A., Adams, M., Row-Farr, J., Tandavanitj, N., and Steed, A. (2004). Orchestrating a mixed reality game 'on the ground'. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '04) (Vienna, Austria, April 24-29, 2004), 391-394. ACM Press.

3. Chalmers, M. (2004). A Historical View of Context. Computer Supported Cooperative Work 13(3-4): 223-247. Kluwer Academic Publishers.

4. Dourish, P. (2004). What we talk about when we talk about context. Personal and Ubiquitous Computing 8(1): 19-30.

5. Etter, R. (2005). Melodious Walkabout—implicit navigation with contextualized personal audio contents. Adjunct Proceedings of the Third International Conference on Pervasive Computing. ISBN 3-85403-191-2.

6. Jones, M., Bradley, G., Jones, S., and Holmes, G. (2006). Navigation-by-music: An initial prototype and evaluation. Proceedings of the International Symposium on Intelligent Environments (Cambridge, UK, April 2006), 95-101. Microsoft Research. ISBN 1-59971-529-5.

7. Östergren, M. and Juhlin, O. (2006). Car drivers using Sound Pryer—field trials on shared music listening in traffic encounters. In "Reinventing music: Social and cultural impacts of new music technology" (Kenton O'Hara and Barry Brown, Eds.). Kluwer Academic Press.

8. Strachan, S., Eslambolchilar, P., Murray-Smith, R., Hughes, S., and O'Modhrain, S. (2005). GpsTunes: Controlling navigation via audio feedback. Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, 275-278. ACM Press.

9. Warren, N., Jones, M., Jones, S., and Bainbridge, D. (2005). Navigation via continuously adapted music. Extended Abstracts, ACM Conference on Human Factors in Computing Systems (CHI '05) (Portland, Oregon, USA, April 3-7, 2005), 1849-1852. ACM Press.

Authors

Matt Jones
Swansea University
always@acm.org

Steve Jones
Waikato University
stevej@cs.waikato.ac.nz

About the Authors

Matt Jones is helping to set up the Future Interaction Technology Lab at Swansea University, Wales. He is the co-author (with Gary Marsden) of Mobile Interaction Design (John Wiley & Sons, 2006). More information at www.undofuture.com.

Steve Jones's research focuses on access to, and navigation within, digital and physical information spaces. In addition to navigation-by-music he is currently investigating multimedia digital libraries on iPods and automatic signage recognition and interpretation on mobile devices. And, no, Matt and Steve are not related.

Figures

Figure 1. Navigation-by-music prototype in use. The system employs an HP iPaq hx4700 Pocket PC connected to a Garmin eTrex Summit, a Global Positioning System (GPS) device that includes a magnetic compass and a serial connection through which GPS and heading data are output at two-second intervals.

Figure 2. Route traces. Each line represents the route taken by a participant using the navigation-by-music system from a start point (green circle) to the end region (red circle). The yellow circles represent intermediate waypoints that guide the user along the route.

Figure 3. Some participants were led off-track by a visual distraction (a goal post on a sports field); the music panning is attempting to lead them toward the penultimate audio beacon (the yellow circle enclosing a red square at the bottom middle of the figure).

Sidebar: Enhancing Mobile Music

While this article has focused on using audio as a means of communicating contextual cues to a mobile user, listening to music is an important activity in its own right, and there are ample opportunities to enhance it.

Figures 4 and 5 are screen shots from two preliminary prototypes. In the first (Figure 4), the mobile player automatically records information about where music tracks are played. Later, the play history can be visualized geographically, allowing the user to access their music by locations (perhaps they play particular music while at the gym or walking through the park?); or to reminisce and reflect in richer ways about locations they've visited.
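
A minimal sketch of such a location-indexed play log, assuming a simple grid-cell index; the class name, cell size, and method names are illustrative, not drawn from the prototype:

```python
from collections import defaultdict

class PlayHistory:
    """Logs (track, location) pairs and answers 'what did I play here?'
    queries by snapping coordinates to coarse grid cells (roughly 1 km
    at the default 0.01-degree cell size)."""

    def __init__(self, cell_degrees=0.01):
        self.cell = cell_degrees
        self.log = defaultdict(list)

    def _key(self, lat, lon):
        # Snap a coordinate to its containing grid cell.
        return (round(lat / self.cell), round(lon / self.cell))

    def record(self, track, lat, lon):
        self.log[self._key(lat, lon)].append(track)

    def tracks_near(self, lat, lon):
        return list(self.log[self._key(lat, lon)])
```

Each non-empty grid cell could then be rendered as a pin on a map, with the cell's track list shown on selection.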

In the second (Figure 5), the system can track the journeys of a series of users and the music they listen to along those routes. These "play routes" can then be used in different ways: Cross someone else's path and the music they were listening to—or something similar in your mobile library—could be added to your playlist; or perhaps you'd like to "tune in" to these routes to hear what other people were listening to as they passed through the location, in a manner similar to the SoundPryer system [7].

Figure 4. Music and location. Red pins indicate locations where the music device has been used. Selecting a pin displays additional metadata about the track. The prototype uses the Google Maps API and data.

Figure 5. "Play routes." The system tracks routes and the music listened to during these journeys to support collaborative music experiences. The prototype uses the Google Maps API and data.


©2006 ACM  1072-5220/06/0700  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2006 ACM, Inc.
