


technology + research

snare designer (2019)

Having used Pure Data extensively to build adaptive music projects, I began applying it to my client work. One tool I designed for this purpose was a snare synthesiser. 

 

I had a client who would routinely ask for distinctive snare sounds, and I noticed that each of these sounds could be reduced to a unique blend of a fundamental, partials, noise, and envelopes. 

 

With the help of a few tutorials outlining how a snare is commonly synthesised, I set about implementing these principles in a Pure Data patch, creating an accessible front-end and hiding the complex operations away in sub-patches and abstractions.
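
The patch itself is graphical, but the underlying recipe can be sketched in a few lines of code. The following is a minimal, illustrative C++ sketch of the fundamental + partials + noise + envelopes idea; the frequencies, decay times, and mix levels are placeholders rather than settings from the actual tool.

#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <vector>

// Minimal snare sketch: a tonal layer (fundamental plus partials) and a noise
// layer, each with its own exponential decay envelope. All values here are
// illustrative placeholders.
int main()
{
    const float kTwoPi = 6.28318530718f;
    const float sampleRate = 44100.0f;
    const int numSamples = (int)(0.3f * sampleRate);   // a 300 ms hit
    std::vector<float> out(numSamples);

    for(int i = 0; i < numSamples; i++) {
        float t = i / sampleRate;

        // Tonal layer: a fundamental plus two inharmonic partials, fast decay
        float tonalEnv = expf(-25.0f * t);
        float tonal = 0.6f * sinf(kTwoPi * 180.0f * t)
                    + 0.3f * sinf(kTwoPi * 330.0f * t)
                    + 0.2f * sinf(kTwoPi * 440.0f * t);

        // Noise layer (the "snares"): white noise with a slower decay
        float noiseEnv = expf(-12.0f * t);
        float noise = 2.0f * ((float)rand() / (float)RAND_MAX) - 1.0f;

        out[i] = 0.7f * tonal * tonalEnv + 0.5f * noise * noiseEnv;
    }

    // Write raw 32-bit float mono samples; import as raw audio at 44.1 kHz to audition
    FILE *f = fopen("snare.raw", "wb");
    if(f) {
        fwrite(out.data(), sizeof(float), out.size(), f);
        fclose(f);
    }
    return 0;
}

Varying the partial frequencies, the envelope times, and the tonal/noise balance is what produces the range of distinctive snare characters described above.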

gesture-controlled instrument (2018)

In 2018, I developed an instrument with a view to reconnecting fully with myself as a performer and bringing adaptive music to the streets of London. 

 

As a foundation, I used the 'Bela' platform, together with Pure Data and C++.  

 

I built the hardware from scratch, which involved a steep learning curve, and ended up with a buskable, waterproof, super-responsive instrument. 
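
To give a sense of what the C++ side of a Bela project looks like, here is a minimal, hypothetical sketch of the general approach: a gesture sensor wired to analog input 0 controls the pitch of a simple oscillator inside the audio callback. The sensor type, input channel, and pitch mapping are assumptions for illustration, not the actual instrument code.

#include <Bela.h>
#include <cmath>

float gPhase = 0.0f;

bool setup(BelaContext *context, void *userData)
{
    return true;
}

void render(BelaContext *context, void *userData)
{
    const float kTwoPi = 6.28318530718f;
    for(unsigned int n = 0; n < context->audioFrames; n++) {
        // Read a gesture sensor on analog input 0 (hypothetical wiring).
        // Analog inputs run at half the audio rate by default, hence n / 2.
        float sensor = 0.5f;
        if(context->analogFrames > 0)
            sensor = analogRead(context, n / 2, 0);

        // Map the sensor value (roughly 0..1) to an oscillator frequency
        float frequency = 110.0f + sensor * 440.0f;
        gPhase += kTwoPi * frequency / context->audioSampleRate;
        if(gPhase > kTwoPi)
            gPhase -= kTwoPi;

        float sample = 0.2f * sinf(gPhase);
        for(unsigned int ch = 0; ch < context->audioOutChannels; ch++)
            audioWrite(context, n, ch, sample);
    }
}

void cleanup(BelaContext *context, void *userData)
{
}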

reciprocal music: interactive music as a standalone (2017)

In 2016 and 2017, I carried out several projects to investigate new forms of music - specifically, recorded music that responds to the input of the listener.  

 

This included forming a community group, 'London Interactive Music Meetup', building relationships with universities like Queen Mary and Goldsmiths, and getting to know some of the leading creators of what could broadly be called 'adaptive music'. 

 

In September 2017, at the conclusion of these projects, I summarised my research in this informal paper. 


london interactive music meetup (2016 / 17)

In 2016, as part of an effort to investigate new forms of music, I formed 'London Interactive Music Meetup' (LIMM), hosting open events across London. 

 

In its first 18 months, I hosted more than 16 events, including regular workshops and the Interactive Music Summit at Google.

 

Topics centred on music that could be influenced in real time by the listener, with a particular focus on generative, procedural, reactive, adaptive, dynamic, non-linear, and algorithmic music. 

 

The group brought together a diverse array of participants and speakers, from artists and musicians to academics, developers, entrepreneurs, and enthusiasts. 
