Tim Yates - Spatial Audio Residency

Tim Yates, Hackoustic co-founder and audio innovator, has been working in spatial audio and ambisonics during his recent IKLECTIK and Amoenus Residency, which included a public showcase. We interviewed him to explore his work and to look at the opportunities and challenges of working in spatial audio.

Tim Yates - photo @LiminalWarp


Tim Yates is an instrument developer, sound-artist, musician and technologist who performs, builds interactive sound installations, and makes instruments for performance and installation. He’s shown his work at, among many other places, the Tate Modern, the V&A and London Design Week, and internationally at Xi’an International Maker Faire and Siestes Électroniques. He is the Founder and Director of Hackoustic, a group dedicated to instrument building, acoustic hacking and sound-art.

How did you get into Spatial Audio?

It grew out of the work I was doing while studying for a Masters degree in composition at the Royal College of Music around 10 years ago. I specialised in electronic and electro-acoustic music so spent a lot of time in the studios there. They had enough speakers to set up a multi-channel system so I started making pieces - compositions and installations - using those. In particular, I made some pieces for a laptop orchestra that I set up while I was there and an interactive installation called Drone Room, that forms the basis of one of the instruments in my current project.

Is it a hard thing to get started in?

The biggest barrier to getting started is getting access to a suitable speaker rig and the space to set it up. To do anything spatial you need at least four speakers, but to make good use of technology like ambisonics you need more like 13, as well as the ability to mount them at different heights, including directly overhead. Most people don’t have access to that kind of rig if they’re not at a college or other institution. After leaving the RCM I had to put my spatial audio ideas aside for that reason, which was very frustrating. That’s the huge value of a system like Amoenus, which is set up to support independent artists in using this technology. Once the hardware’s in place, it’s pretty straightforward to get started with everything else you need. The tools will feel familiar to anyone working with music production or tech, even if the implementation ends up being quite different.

Tim Yates showcase - photo Kitmonsters

What kind of audio projects had you created prior to the Residency?

I’ve worked on so many different kinds of projects and performances it’s hard to narrow it down! I’ve performed in various contexts ranging all the way from solo classical guitar recitals through to playing free-improv with nothing but a bit of sandpaper and a shelf bracket! I also make interactive sound-art, build my own instruments, and have produced a wide variety of shows and events. For example, I was executive producer on an accessible touring show called Planted Symphony. I even recently ran a workshop at Imperial College exploring the sex lives of mosquitoes using interactive music tech! I founded and run a group called Hackoustic with Tom Fox (aka Vulpestruments), providing a platform for musicians and instrument makers to show this kind of work.

I’m also particularly involved with developing accessible instruments in partnership with disabled musicians - which has been a big part of my practice for the last few years. There are far too many people in the world who don’t have an instrument that they can play because traditional instruments don’t work for them. There’s loads of scope for making sure that everyone has an instrument that they can use to express themselves musically, and I love trying to figure out ways to make that possible.

In general I’m interested in working outside of traditional instrument paradigms - finding new ways of working with sound, instruments, interfaces and objects to explore ways of expanding the possibilities for musical expression. There’s so much space to play with these things - I find it eternally interesting and exciting!

Tim Yates showcase - photo @LiminalWarp

Experimenting with spatial audio and ambisonics

Tell us about what you set out to do for the Residency?

The residency was funded by a Develop Your Creative Practice grant from Arts Council England, which is intended to help artists move into new areas of practice or develop new skills. With that in mind, my goal was to learn about spatial audio - in particular ambisonics - and begin to explore how it could be used to create new kinds of sonic experiences. It was a rare and valuable opportunity to experiment, try out new ideas and really push the limits of what I’m able to do, both technically and creatively. So I tried to work on a number of different fronts, including experimenting with what an instrument might be in terms of the type of interaction that’s possible and the physical experience of playing, and with how an ambisonic setup changes the relationship between the audience, instruments and music. Because of the nature of spatial sound, there’s no longer such an obvious need for a particular focal point in the room as there is with stereo - no imperative for everyone to face the same way or look at the same thing. Audiences and performers can have the freedom to move around the space, which can really change their experience of the music.

I also wanted to explore how we can step outside of traditional performance paradigms, particularly inspired by groups like Punchdrunk, who’ve transformed the way that a theatre show can be presented. Once you can let go of the traditional ideas of a musical performance structure, there are many different areas to explore. For practical reasons my showcase took the form of three short shows over one evening, but I like the idea of a long-running, audience-interactive installation with periodic interventions by live musicians moving through the space and interacting with the audience; not on a stage, but in amongst the installations, performing along with the audience members.

There was a lot I was trying to do (maybe too much!) but I feel that all of these ideas are tied very closely together and need to be developed in parallel. With that in mind I knew that this would only be the very first iteration of a much longer term project.

Tim Yates showcase - photo Kitmonsters

Creating new instruments with spatial audio

What instruments or instruments within an instrument did you create?

The central idea behind the instruments I made was that they’re instruments that you’re in, rather than ones that you hold and play. There were three main instruments that I used for the showcase, although there are others that I’m developing but didn’t use because I didn’t want to overwhelm the room with too many things! The designs are all still in the early stages and the instruments haven’t yet got names, but hopefully those will occur to me as they develop.

The first one is a wireless handheld drone instrument (as in it makes a continuous sound, rather than being a flying robot!) where you draw the shape of the waveform that’s playing in the room on a portable touch screen. There are three of these, and I like the idea that it’s a bit ambiguous whether it’s three separate instruments or one instrument that requires three players! I wanted to give really low-level access to digital sound, so the shape that you draw is written directly into the wavetable of a series of oscillators in real time. These oscillators move independently around the room, which sets up continuously shifting relationships - standing waves and interference patterns - between those sounds and the space. It’s quite a subtle effect, but a way of trying to really use the system and spatialisation as an intrinsic part of the instrument design. It’s a drone that forms the underlying bedrock of the sound in the room for the whole installation.
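Tim’s actual implementation isn’t described in detail, but the core idea - resampling a drawn shape into a wavetable and scanning it at audio rate - can be sketched in a few lines. This is a minimal illustration, not his code; the function names, table size, and linear-interpolation resampling are all my own assumptions:

```python
TABLE_SIZE = 512


def shape_to_wavetable(points, size=TABLE_SIZE):
    """Resample a drawn shape (a list of y-values in [-1, 1]) into a
    fixed-size wavetable using linear interpolation between points."""
    table = []
    for i in range(size):
        pos = i * (len(points) - 1) / (size - 1)
        lo = int(pos)
        hi = min(lo + 1, len(points) - 1)
        frac = pos - lo
        table.append(points[lo] * (1 - frac) + points[hi] * frac)
    return table


def oscillator(table, freq, sample_rate=48000):
    """Generate samples by stepping through the wavetable at a rate
    that produces the requested frequency."""
    phase = 0.0
    step = freq * len(table) / sample_rate
    while True:
        yield table[int(phase) % len(table)]
        phase += step


# A drawn upward ramp becomes a sawtooth-like drone waveform
table = shape_to_wavetable([-1.0, 0.0, 1.0])
osc = oscillator(table, freq=110.0)
samples = [next(osc) for _ in range(256)]
```

In a real-time version, redrawing the shape would simply overwrite `table` while the oscillators keep running, which is what gives the player that direct, low-level contact with the sound.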

Tim Yates showcase - photo @LiminalWarp

Another instrument was based around a lever that I hacked from a flight simulator game controller. I want to explore ways to make what seem like really simple interactions musically compelling, in ways that encourage us to listen to the sound with more attention. To play this one, you use a foot pedal to start and stop the sound, and the lever then controls a single synthesis parameter of an underlying hardware synthesiser. In fact the lever is controlling something like a quarter turn of one knob on a synth that has maybe 30 different knobs and a lot of other complexity. It’s rare that we really pay attention to the minutiae and detail of the sounds that we play, but there are whole worlds of sound in even the smallest of these changes, and I find it really compelling to explore that.
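Tim doesn’t describe his exact mapping, but conceptually, confining the lever’s full throw to a quarter-turn of one knob is just rescaling the controller’s range into a narrow slice of the parameter’s range. A hypothetical sketch (the raw 10-bit range and the 0.25-0.50 window are illustrative values, not his):

```python
def map_lever(raw, raw_min=0, raw_max=1023,
              param_lo=0.25, param_hi=0.50):
    """Map the lever's full physical travel onto a narrow window of a
    single synth parameter - here a quarter of a normalised 0.0-1.0
    knob range. All the numbers are illustrative assumptions."""
    raw = max(raw_min, min(raw, raw_max))          # clamp out-of-range readings
    norm = (raw - raw_min) / (raw_max - raw_min)   # normalise to 0.0 .. 1.0
    return param_lo + norm * (param_hi - param_lo)


# Full lever travel only ever sweeps 0.25 -> 0.50 of the parameter,
# so the player hears the fine detail inside that quarter-turn.
```

The design point is that shrinking the output window magnifies the player’s resolution: the whole arm movement is spent on a tiny region of the synth’s sound space.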

The final set of instruments were panels with built-in touch sensors. Each triggered a very different kind of sound - one a sub-bass, one environmental sounds, and the other short percussive sounds - and each had a slightly different pattern of interaction, with the sound moving through the space in its own way. I covered them with different fabrics so that they had a physical texture that was part of the interaction. We’re used to tech being shiny plastic, glass and brushed aluminium, and I think we can bring a lot to an interaction by playing with materials and the way they feel in the hand. This was really my first time exploring this idea, and in future I’d like to take the materials and textures much further.

Tim Yates showcase - photo Kitmonsters

The use of light and interactivity

A common element in the instruments is the use of light, which was built into all of them in various ways. One of the challenges with building interactive sonic tech, especially in a busy musical environment like this, is making the instruments intuitive enough that they feel responsive, and the audience can tell they’re interacting with them and know what impact they’re having. I tried to use light to do this - both to direct how the instruments are used (anything that is lit up can be touched) and to show that the instrument has registered the interaction. Given that this was part of an R&D project, I decided to challenge the audience by giving almost no direction as to what the instruments were or how they could be used. I wanted to see how intuitive they really are, and where I need to give more guidance or change the design to make the interaction more obvious. Those interactive elements weren’t all successful in the showcase, and the feedback I got will go a long way towards informing future iterations.

Isa Barzizza, Tim Yates showcase - photo Kitmonsters

What kinds of tools both software and physical did you use?

There were two main sets of tools that I used, one for working with the ambisonic system and the other for actually building the instruments.

For the ambisonics I used Envelop, a free family of Ableton plugins, to do the mixing on my laptop. That sent the audio over Dante, an Ethernet-based audio routing protocol, into a Max patch developed by Amoenus that uses SPAT to output the sound to the speakers - all via a Dante-enabled Focusrite audio interface. It’s a fairly complicated chain to get going, but pretty intuitive to use once it’s set up.

For the instruments themselves, I used a combination of Raspberry Pis and Teensys for the control hardware, coded in C/C++ with openFrameworks for the graphical elements. The touch interfaces use a combination of MPR121 and Bela Trill sensors, and the lever instrument uses the hacked flight simulator game controller. All the instruments communicate over WiFi, so I have a high-powered gaming WiFi router to make that as reliable as possible. An extra challenge I wasn’t expecting was just getting hold of the hardware during the project - global supply chain issues meant that most of the tech I needed wasn’t available most of the time, so I spent far more time than I’d have liked tracking down the bits I needed!
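The interview doesn’t say which wire protocol the instruments speak over WiFi, but Open Sound Control (OSC) over UDP is a common choice for rigs like this, and it’s simple enough to pack by hand. A sketch using only the standard library - the address `/instrument/touch` is a made-up example, not one of Tim’s:

```python
import socket
import struct


def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC 1.0
    spec requires for strings."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b


def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address string,
    padded type-tag string ',f', then a big-endian 32-bit float."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")
            + struct.pack(">f", value))


def send_reading(sock: socket.socket, host: str, port: int, value: float):
    # One UDP datagram per sensor reading; fire-and-forget, which is
    # why a reliable WiFi link matters for this kind of setup.
    sock.sendto(osc_message("/instrument/touch", value), (host, port))


# Build a packet (actually sending it requires a receiver on the network)
packet = osc_message("/instrument/touch", 0.5)
```

In practice a Teensy or Pi would send messages like this to the laptop running the spatialisation software, which is one plausible way the touch panels could drive sounds moving around the room.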

Are there challenges that come with working in Spatial Audio?

Spatial audio can be challenging in various ways, not least getting to grips with the spatialisation software and the different ways of controlling and positioning the sound. There are so many possibilities for how it can be used, and figuring out how to make that work can be daunting. That’s also the exciting thing about it though - sound works differently in that context, and using that potential is something I’ve only begun to scratch the surface of. It feels like a technology that’s just finding its place - even though ambisonics has been around for a long time, it hasn’t been available to the vast majority of musicians - and so there’s a lot of space to experiment and explore.

Because it’s a medium that doesn’t have such a defined performance tradition as traditional stereo audio, the field is wide open for experimentation which is both a blessing and a curse!

Joris Beets - photo @LiminalWarp

The showcase - interaction and live musicians

How did the Showcase event go?

The showcase was really successful and served as a fantastic proof of concept for the project as a whole. It was very exciting to try out a lot of things that I’d never done before - everything from the basic setup of the instruments in the room, to new types of instrument, different kinds of audience interaction, and mixing interaction with live musicians. I was really lucky to work with the incredible harpist Joris Beets, who’s also a very skilled instrument maker. He’s a sensitive and imaginative improvising musician, and his performance brought all the different elements together in a way that was really compelling. Overall I learnt an incredible amount from the event, and also had a great time!

What kind of feedback did you get?

I was absolutely delighted with the feedback I got both in person on the night and in the feedback forms that the audience filled in. In particular, people really enjoyed the way the hierarchy of music and musicianship was changed between the live musicians and the audience and the way that everyone was able to play together in the room. I also got some really useful feedback about what was and wasn’t intuitive about the instruments and how easy it was for people to know exactly what effect they were having while they were playing. These were things that I could only discover by trying it out with a real audience and I was very happy with the constructive feedback people gave me on where improvements could be made.

Tim Yates and Joris Beets - photo Kitmonsters

What was it like working with IKLECTIK and Amoenus?

It was just great - both organisations were as supportive as they could have been and were a joy to work with. I’ve been looking to get back into spatial audio for a long time, so when I saw that the system was being installed I jumped at the chance to get involved. Christian Dukka who runs Amoenus was incredibly open to letting me use the system and teaching me how it all worked for which I’m eternally grateful. It’s wonderful to see the way he’s building a community of artists and musicians around the system.

I just love working at Iklectik. I’ve been putting on events there almost since it opened - it’s the spiritual home of Hackoustic and Isa and Eduard who run it are awesome. They work unbelievably hard and with such commitment to support artists and the community - it’s a rare and precious place. I couldn’t have done even half of what I do without it!

Tim Yates showcase - photo @LiminalWarp

Do you have any tips for people who want to work in Spatial Audio?

I think the main tip is just to take any opportunity you can get and dive straight in. There is growing interest in this kind of music making and I think that’s only going to increase, so hopefully there’ll be more opportunities available. Amoenus run regular workshops to get you started and there are various other organisations around the world who do the same. If you’re lucky enough to be near one of them, then just show up and get started! There’s so much to explore.

What’s next for you and the project?

I’ve really only got started so there’s a lot to focus on over the next months and years. The residency was just the first R&D phase and there’s quite a bit of work to be done technically and creatively to get it ready for where I want to go next. My plan is to spend some time perfecting and refining what I’ve developed so far and hopefully do some more shows with Amoenus so there’ll be some more showcases later in the year where I test all that out. Once the first phase of the project is fully completed, I’m planning to tour it as far and wide as possible! I’m really excited about the possibilities for this kind of work, so I’m just going to keep exploring and pushing myself to really make something amazing.