The installation assigns an instrument to each person in the room. Depending on where you stand, a different effect is applied to your sound, for example making it distorted or full of echo. With multiple people in the room, each carrying a different sound, every second produces a new and unique kind of music. Unlike all the other works, this one was a collaboration.
Digital & Analog Installation Ableton, MAX Cycling, Kinect ZHdK Propädeutikum, 2021
Result
Our process started with asking ourselves: why do we like an art installation? Just like an actor, it can take on different roles and immerse itself in its surroundings. Trying to make use of this adaptability, we explored many ways to incorporate technology into four walls. Through continuously testing what worked and felt good, we found that music had an astonishing effect on us, so we decided to make it the main element. As soon as you become a part of the room, the room becomes a part of you.
We started by combining different hardware and software to check whether our goal was feasible. Using Ableton, Max (Cycling '74), and a Kinect sensor, we were able to build a simple version of our end product. After a quick integration of Open Sound Control (OSC), we could start body-storming: we set up four speakers and walked around the room with a single sound playing. Via OSC we simulated a coordinate system and moved a dot around according to our real-life position in the room. This gave us an idea of what we needed and what it should feel like to be part of this installation.
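As a minimal sketch of how a simulated position can be sent over OSC (assuming the python-osc package; the host, port, and "/position" address are illustrative placeholders, not the exact setup of our patch):

```python
# Minimal sketch: send a simulated x/y position over OSC to a listening patch.
# Assumes the python-osc package; address and port are illustrative only.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # patch listening on UDP port 7400

def send_position(person_id: int, x: float, y: float) -> None:
    """Send one person's normalized room coordinates (0.0 to 1.0)."""
    client.send_message("/position", [person_id, x, y])

# Simulate walking diagonally across the room.
for step in range(11):
    t = step / 10.0
    send_position(0, t, t)
    time.sleep(0.2)
```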
Process
For the input, we used a Kinect sensor mounted on the ceiling. Since a Kinect can only map depth and not track actual people, we had to find a workaround. We used Max (Cycling '74), a visual programming environment that integrates well with Ableton. To create a mapping plugin, we adapted a script originally developed for touchscreen tracking and modified it to track people in a “blob-tracking” manner. With this, we could track multiple people and store their coordinates in an array to forward to the next stage in Ableton.
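To illustrate the blob-tracking idea outside of Max (this is not our actual patch), a depth frame can be thresholded and its connected components treated as people. The following sketch assumes OpenCV and NumPy; the depth range, minimum blob size, and frame layout are assumptions:

```python
# Illustrative blob tracking on a ceiling-mounted depth frame.
# Our installation does this inside a Max (Cycling '74) patch; the values
# here (depth range, minimum blob area) are placeholders.
import cv2
import numpy as np

def track_blobs(depth_frame: np.ndarray, near_mm: int = 500, far_mm: int = 2500,
                min_area: int = 800) -> list[tuple[float, float]]:
    """Return normalized (x, y) centroids of person-sized blobs."""
    # Keep only pixels within the expected head/shoulder distance from the ceiling.
    mask = ((depth_frame > near_mm) & (depth_frame < far_mm)).astype(np.uint8) * 255
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    h, w = depth_frame.shape
    people = []
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            cx, cy = centroids[i]
            people.append((cx / w, cy / h))  # normalize to 0.0-1.0
    return people
```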
Tracking People
To turn the coordinates into different sounds, we placed them on an x/y grid, representing each person with a dot. For example, a person walks into the room and is assigned an instrument with two effects on it, such as reverb and pitch shift. As the person moves along the x-axis, the reverb gets heavier and heavier, whereas walking along the y-axis shifts the instrument's pitch slightly higher or lower. This alone allows for an immense number of sound combinations per instrument, but what makes it a truly interactive experience is when multiple people combine their instruments to create a completely new and unique kind of music every second.
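A minimal sketch of this position-to-parameter mapping, assuming normalized coordinates from the tracker; the effect ranges are chosen here purely for illustration, not taken from the installation:

```python
# Map a normalized room position to two effect parameters, illustrating the
# "x controls reverb, y controls pitch shift" idea. Ranges are assumptions.
def position_to_effects(x: float, y: float) -> dict[str, float]:
    """x, y in 0.0-1.0 (room coordinates from the tracker)."""
    reverb_wet = x                      # 0 % wet at one wall, 100 % at the other
    pitch_semitones = (y - 0.5) * 4.0   # -2 to +2 semitones across the room
    return {"reverb_wet": reverb_wet, "pitch_semitones": pitch_semitones}

print(position_to_effects(0.25, 0.75))  # {'reverb_wet': 0.25, 'pitch_semitones': 1.0}
```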
Using an ambisonic system within Ableton, we were able to translate each person's position in the installation into a surround sound system, giving others the illusion that your instrument's sound is really coming from your position.
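Conceptually, the surround panner only needs an angle from a reference point; a small sketch of that conversion follows, where the room centre and the zero-degree direction are assumptions made for illustration:

```python
# Convert a normalized room position into an azimuth angle for a surround /
# ambisonic panner. The room centre at (0.5, 0.5) and the 0-degree reference
# are illustrative assumptions, not the installation's exact calibration.
import math

def position_to_azimuth(x: float, y: float) -> float:
    """Return azimuth in degrees (-180 to 180) relative to the room centre."""
    dx, dy = x - 0.5, y - 0.5
    return math.degrees(math.atan2(dx, dy))  # 0 degrees points toward the "front" wall
```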
Creating the Music
As for the actual “music” that should be playing in the end, we went for an atmospheric sound, inspired by artists like Tim Hecker and Brian Eno. More specifically, we decided to create seven different tracks that would be randomly assigned to each person. For the overall feeling we created three ambient tracks and one melodic track, so that no matter which combination of instruments is playing, they don't compete with each other for space in the music. For a bit more rhythm we made one percussion track, one bassline track, and one set of chords. For the interaction part, each instrument had two different effects applied to it, so you, as a part of the installation, can affect what you sound like. Each track works with any combination of the other tracks, making every sound feel right and unique.
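A small sketch of how such a random track assignment could work, assuming each newcomer is given a track from the pool without duplicates while free tracks remain; the track names are placeholders, not the titles of our tracks:

```python
# Randomly assign one of the seven tracks to each person entering the room,
# avoiding duplicates while unassigned tracks remain. Names are illustrative.
import random

TRACKS = ["ambient 1", "ambient 2", "ambient 3", "melodic",
          "percussion", "bassline", "chords"]

def assign_tracks(person_ids: list[int]) -> dict[int, str]:
    free = random.sample(TRACKS, len(TRACKS))  # shuffled copy of the pool
    assignment = {}
    for pid in person_ids:
        if not free:  # more people than tracks: refill the pool
            free = random.sample(TRACKS, len(TRACKS))
        assignment[pid] = free.pop()
    return assignment

print(assign_tracks([0, 1, 2]))
```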