How was it made?

XXV.2 March-April 2018
Page: 12

Collidoscope


Authors:
Ben Bengler, Fiore Martin, Nick Bryan-Kinns


Describe what you made.

Collidoscope is an interactive, collaborative musical instrument that allows users to seamlessly record, manipulate, explore, and perform real-world sounds.

Using built-in microphones, players can record sounds (e.g., their voice) into Collidoscope and then explore these sounds using large sliders alongside the displayed waveforms. In this way, players can move through the sounds, play them back at different speeds, freeze them at a particular position, loop parts of them, or layer sound snippets on top of each other. This results in novel sound textures and timbres, which can then be played via the keyboard, allowing for both musical and explorative interaction.
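
To make this interaction model concrete, here is a minimal C++ sketch (an illustration, not Collidoscope's actual code) of the kind of state the slider and keyboard map onto: a playhead position in the recorded buffer, a playback speed that freezes the sound when set to zero, a loop region, and a pitch ratio selected on the keys.

    // Illustrative sketch only: the control state a slider/keyboard could drive.
    #include <cstddef>

    struct ScrubState {
        double playhead  = 0.0;  // position in the recorded buffer, in samples (big slider)
        double speed     = 1.0;  // playback rate; 0 "freezes" the sound at the playhead
        double loopStart = 0.0;  // loop region, in samples
        double loopEnd   = 0.0;
        double pitch     = 1.0;  // pitch ratio selected on the keyboard (1.0 = as recorded)
    };

    // Advance the playhead by one audio block, wrapping inside the loop region.
    inline void advance(ScrubState& s, std::size_t blockFrames)
    {
        const double loopLen = s.loopEnd - s.loopStart;
        if (loopLen <= 0.0) return;                    // no loop region set yet
        s.playhead += s.speed * static_cast<double>(blockFrames);
        while (s.playhead >= s.loopEnd)   s.playhead -= loopLen;
        while (s.playhead <  s.loopStart) s.playhead += loopLen;
    }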

[Figure: Collidoscope's rotary sound slider.]

What for you is the most important/interesting thing about what you made?

The key aspect of Collidoscope is that it makes granular synthesis, a fascinating audio-manipulation technique typically found in expert software packages (e.g., for very fine-grained pitch and time adjustment or for creative effect), accessible to non-expert users (e.g., museum audiences). This allows them to explore and experiment with real-world sounds in novel and intuitive ways by combining direct haptic control with matched visualizations that closely represent the underlying sound-manipulation process.
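
For readers less familiar with the technique, here is a minimal granular-synthesis sketch in plain C++ (the actual Collidoscope engine is written in C++/Cinder and linked in the footnotes; grain length, spacing, and windowing here are illustrative assumptions). Short, Hann-windowed grains are read from the recorded buffer around the slider position and summed, which is what lets a sound be frozen, stretched, or scrubbed without simply speeding it up.

    // Illustrative granular synthesis: sum short, windowed grains taken from
    // around a single position in the recording. Freezing the position while
    // grains keep firing produces sustained, "frozen" textures.
    #include <cmath>
    #include <cstddef>
    #include <vector>

    std::vector<float> granulate(const std::vector<float>& recording,
                                 double centre,         // grain centre in samples (slider position)
                                 std::size_t grainLen,  // grain length, e.g., 2048 samples
                                 std::size_t hop,       // spacing between grain onsets in the output
                                 std::size_t numGrains)
    {
        const double pi = 3.14159265358979323846;
        std::vector<float> out(hop * numGrains + grainLen, 0.0f);
        for (std::size_t g = 0; g < numGrains; ++g) {
            for (std::size_t i = 0; i < grainLen; ++i) {
                // Hann window avoids clicks at the grain boundaries.
                const float w = 0.5f * (1.0f - static_cast<float>(std::cos(2.0 * pi * i / (grainLen - 1))));
                const long src = static_cast<long>(centre) - static_cast<long>(grainLen / 2) + static_cast<long>(i);
                if (src < 0 || src >= static_cast<long>(recording.size())) continue;
                out[g * hop + i] += w * recording[static_cast<std::size_t>(src)];
            }
        }
        return out;
    }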

[Figure: Setting up Collidoscope at the Victoria and Albert Museum, London.]

Briefly describe the process of how this was made.

Being big fans of granular synthesis, we toyed for quite a while with several ideas for turning this technique into a real-time performance instrument. The key idea was always to allow for direct, palpable exploration and manipulation of real-world sounds. The first iteration, built at a Music Hack event in 2015 (photo mid-right), used a laptop running an audio engine developed in SuperCollider. Sounds could be recorded and manipulated via a long sensor strip connected to a microcontroller, allowing people to "slide" through the sounds according to the touch position, which was visualized by an adjacent LED strip. Though we liked the sonic possibilities and the flexibility of the sound manipulation, LED feedback, and resistive sensor strip, the force required to operate the strip made it feel clumsy and somewhat choppy.

This led to the idea of developing a mechanical slider-knob combination, allowing players to "drive" through the sound with smooth movements and a highly tactile feel. Running on aluminum rails, the mechanism is based on sliding elements made of acrylic and Delrin, a low-friction material; spring-loaded wipers on these elements actuate a linear position sensor. These haptic controls were complemented with real-time visualizations of the sound-manipulation process, aiming to display what is going on at any point during the interaction.
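
As a rough illustration of the first prototype's sensing chain, here is an Arduino-style microcontroller sketch that reads a position sensor and streams it to the host running the audio engine. The pin, scaling, and serial protocol are our assumptions for illustration, not the original firmware.

    // Illustrative firmware sketch (assumed wiring and protocol, not the original):
    // read the position sensor and stream it to the laptop running the audio engine.
    const int SENSOR_PIN = A0;              // resistive strip / linear position sensor

    void setup() {
      Serial.begin(115200);                 // serial link to the host
    }

    void loop() {
      int raw = analogRead(SENSOR_PIN);     // 0..1023 on a typical 10-bit ADC
      int pos = map(raw, 0, 1023, 0, 255);  // scale to one byte for a compact message
      Serial.write((uint8_t)pos);           // host maps this to a playhead position
      delay(5);                             // roughly 200 updates per second
    }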

As the project developed toward a self-contained interactive installation, the audio and graphics engines were completely redeveloped in C++/Cinder to run on a single-board computer (SBC) such as the Raspberry Pi and allow for embedded operation. The sliders and user-interface hardware are connected through a generic USB HID device, making them fully independent of the SBC platform.
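
As an example of what being "connected through a generic USB HID device" can look like on the SBC side, here is a minimal sketch using the open-source hidapi library; the vendor/product IDs and the report layout (two 16-bit slider values) are placeholders, not Collidoscope's actual protocol.

    // Minimal HID read loop using hidapi (placeholder IDs and assumed report layout).
    #include <cstdint>
    #include <cstdio>
    #include <hidapi/hidapi.h>

    int main()
    {
        if (hid_init() != 0) return 1;

        // Placeholder IDs: substitute the interface board's real vendor/product ID.
        hid_device* dev = hid_open(0x1234, 0x5678, nullptr);
        if (!dev) { hid_exit(); return 1; }

        unsigned char report[8] = {0};
        while (hid_read(dev, report, sizeof(report)) > 0) {
            // Assumed layout: two little-endian 16-bit slider positions.
            uint16_t sliderA = report[0] | (report[1] << 8);
            uint16_t sliderB = report[2] | (report[3] << 8);
            std::printf("sliders: %u %u\n", (unsigned)sliderA, (unsigned)sliderB);
        }

        hid_close(dev);
        hid_exit();
        return 0;
    }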

[Figure: First exploratory prototype.]

Was there anything new for you in the making process or materials that you can tell us about?

One of the biggest challenges in making Collidoscope was that, despite its size, we wanted it to fit into just two trolley cases, meeting the standard size requirements for checked baggage on flights. This led to a highly modular approach (photo mid-left), with a multitude of small units that had to be assembled and disassembled on site. This design requirement put extra strain on the physical build, which had to be completed in only a couple of weeks.

[Figure: "Granulizing" recorded sounds with Collidoscope.]

What was the biggest surprise in making this?

While Collidoscope proved to be a very successful public installation, shown at major international art venues and festivals (e.g., Ars Electronica, Sonar Festival, Victoria and Albert Museum), it was the reaction of the online audience that was truly surprising. In November 2015, a music technology blog published a video of Collidoscope via its Facebook channel. The video instantly went viral (see link below), reaching four million views in the first 24 hours; to date it has reached an online audience of more than 14 million.

[Figure: Two players "jamming" with Collidoscope at the Sonar Festival.]

Authors

Ben Bengler, University College London, Intel ICRI Cities/UCLIC, [email protected]

Fiore Martin, Centre for Digital Music (C4DM), Queen Mary University of London

Nick Bryan-Kinns, Centre for Digital Music (C4DM), Queen Mary University of London

Footnotes

http://collidoscope.io/

https://code.soundsoftware.ac.uk/projects/opencollidoscope

https://www.youtube.com/watch?v=9XMfKYVu_fg


Copyright held by authors

