research

Welcome to the research blog for a practice research project for pyka focused on the creation of a Score for The Expression Orchestra. A Score is a non-linear composition and sound world designed to be played using the instruments of the orchestra. The instruments are bespoke objects that function as MIDI controllers; the raw data from the sensors in the instruments is converted to MIDI information that can trigger and manipulate sounds and effects within an Ableton Live set. Composing a Score, then, is ‘composing for potential’: creating the sonic parameters for performers to play within.

This project will explore the process of creating Scores, reflecting on what works, what doesn’t, and what that tells us about composing in this way. The broader aim is to build towards a kind of ‘composing for potential’ guide for future Score producers. Along the way, the outcomes will include: a brand new Score developed through composition experiments, public testing with participants to see how the Score works in live performance, and collaborative reflections with another Score creator (Leigh Davies). The process will be informed by existing practice and literature on non-linear composition and interactive sound design.

The blog will document this journey, the practical experiments, the theory, and the surprises, as I try to sketch out a praxis for making Scores that others can build on.

Project Context

The Expression Orchestra is an ensemble of multi-sensory instruments that provide diverse access to shared musical expression. The Orchestra consists of 4 elements: Performers, Instruments, Conductor, and Scores. Performers are essential to the communal performance experience. Without them, the space is dark, silent, and still. With them, a collaborative composition can unfold. The Instruments are bespoke physical objects, each designed in a unique way to be performed and to fulfil a different role within the composition. The Conductor is the silent collaborator that adaptively interprets performers' interactions and honours their creative intentions within the sensory outputs of the orchestra. The Scores are created to imbue the orchestra with creative potential. Instead of traditional fixed musical works, Scores allow performers to play and explore within a carefully crafted composition space that manifests across the senses. Together, these elements create an immersive environment of multi-sensory potential, where participants become performers and contribute to an open-ended collaborative performance, regardless of previous experience or perceived musical ability.

The foundational concept was developed in Additional Learning Needs (ALN) settings in an R&D project funded by the Arts Council of Wales. Here we learnt that attempting to meet diverse needs can lead to a one-size-fits-all approach, producing feature-rich and often over-engineered solutions. Reflections on this led to the founding concept of the project: that access and richness can be better achieved through polyphony. Support from the Atsain Fund saw participatory design in ALN settings result in early prototypes and performances in public showcases. We then built on the expressive capacity of the instruments, utilising Machine Learning and data sets gathered with workshop participants, through the MyWorld Sandbox programme at Pervasive Media Studio. The residency resulted in a public showcase featuring refined prototypes of the instruments. Here we reflected on and codified our Access-First Design principles into a public-facing document. We also learnt about considerations for ethical consent in training Machine Learning models. With funding from Media Cymru, we’re currently making The Expression Orchestra easily tour-able. We’re refining instrument fabrication to make the prototypes ready for mass engagement.

Phase 1:

Mapping Potentials

This phase involved revisiting the instruments and the affordances of their interactions. Specifically, what I was trying to identify here was the ways the instruments’ interactions could be used/mapped within Ableton: did they trigger notes? Could they modulate parameters on an instrument or effect? What behaviours/interactions caused these?

There are currently 4 instruments in the ensemble, each named after the [inter]action used to play them: THROW, PRESS, ROLL, and PLACE. THROW, PRESS, and ROLL have fairly similar setups in that they are designed for ‘sound making’, and each allows for the following:

Note: This is the label given to an instrument’s ability to trigger a note on/off, which happens when the instrument has a registrable impact with another object. For THROW, a note on/off is triggered both when it is thrown and when it is ‘caught’ (the return impact after being thrown). For PRESS, a note on/off is triggered each time it is ‘stood on’. For ROLL, a note on/off is triggered on impact, so rolling alone is unlikely to trigger this unless the instrument is ‘hit’, but it is likely to trigger if it hits something while rolling. The sensitivity that qualifies a note to trigger on/off is baked into the PD patch, and not currently a concern. There will likely be feedback on how this works during the testing of the Score with participants.

Modulate: This is the label given to an instrument’s ability to modulate values based on a range of movement from a sensor. For THROW and ROLL, this is the curated accelerometer data from their acceleration after being thrown/rolled. For PRESS, this is the pressure/overall force being applied across the surface. This instrument is in the process of being refined, with a new way of configuring the sensors, so iterating and testing how this affects the Score will be important.

Depth: This is the label given to the quality of an instrument being used in a slow, steady/consistent way, which unlocks a new instrument state with an additional 0-127 range to work with. For this state to be unlocked, the Conductor must recognise the instrument being played in this slow/consistent way for a certain amount of time, and it will ‘release’ the instrument from the state once there is a recognisable shift in interaction. For THROW and ROLL this is when they are being moved slowly and consistently. For PRESS this is when there is consistent/steady pressure being applied to the surface.

PLACE works slightly differently to the other instruments because it is not ‘sound making’ itself per se, but more of an environmental instrument, in that interacting with it affects/controls parameters of the ‘sound making’ instruments and/or the piece as a whole in some way. PLACE is played by placing a small ball in one of 8 positions, so it works similarly to Note in that it triggers on/off states. These are mapped in the diagram below:

[Mapping potentials at home and at The Dugout, Abergavenny]
[📸 taken by staff at Dugout]
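Since Note, Modulate, and Depth will come up repeatedly in later posts, here is a rough, hypothetical sketch of how I think about that mapping for a single ‘sound making’ instrument. The real conversion happens inside the PD patch and the Conductor, so everything here (the thresholds, timings, CC numbers, and function names) is my own invented illustration rather than the actual implementation.

```python
# A hypothetical sketch of the Note / Modulate / Depth mapping described above.
# All thresholds, CC numbers, and names are assumptions for illustration only;
# the real logic lives in the PD patch and the Conductor.

import time

IMPACT_THRESHOLD = 0.8    # assumed: accelerometer spike that counts as a registrable impact
SLOW_CEILING = 0.2        # assumed: below this, movement counts as slow/consistent
DEPTH_HOLD_SECONDS = 3.0  # assumed: how long slow/steady play unlocks the Depth state


def scale_to_midi(value, lo, hi):
    """Map a raw sensor reading onto the 0-127 MIDI range."""
    value = max(lo, min(hi, value))
    return int((value - lo) / (hi - lo) * 127)


class InstrumentMapper:
    """Turns one instrument's sensor stream into Note / Modulate / Depth events."""

    def __init__(self, note=60, modulate_cc=1, depth_cc=2):
        self.note = note                 # Note: pitch triggered on impact
        self.modulate_cc = modulate_cc   # Modulate: CC driven by movement
        self.depth_cc = depth_cc         # Depth: extra 0-127 range once unlocked
        self.slow_since = None
        self.depth_unlocked = False

    def process(self, accel_magnitude, now=None):
        """Return a list of (event, data) tuples for one sensor frame."""
        now = time.monotonic() if now is None else now
        events = []

        # Note: a registrable impact triggers a note on/off pair.
        if accel_magnitude >= IMPACT_THRESHOLD:
            events.append(("note_on", self.note))
            events.append(("note_off", self.note))

        # Modulate: movement data scaled onto a continuous controller.
        events.append(("cc", (self.modulate_cc,
                              scale_to_midi(accel_magnitude, 0.0, IMPACT_THRESHOLD))))

        # Depth: slow, consistent play for long enough unlocks an extra range;
        # a recognisable shift in interaction releases it again.
        if accel_magnitude <= SLOW_CEILING:
            self.slow_since = self.slow_since or now
            if not self.depth_unlocked and now - self.slow_since >= DEPTH_HOLD_SECONDS:
                self.depth_unlocked = True
                events.append(("depth_unlocked", None))
            if self.depth_unlocked:
                events.append(("cc", (self.depth_cc,
                                      scale_to_midi(accel_magnitude, 0.0, SLOW_CEILING))))
        else:
            self.slow_since = None
            if self.depth_unlocked:
                self.depth_unlocked = False
                events.append(("depth_released", None))

        return events
```

The numbers will be completely different in practice, but framing each instrument as a small state machine like this has been a useful way of thinking about what a Score can actually listen for.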

Phase 2:

Sonic Exploration

The mapping of the affordances in the previous phase was essential for gaining an understanding of the different ways the composition could be interacted with or performed. It was helpful to know these parameters and limitations, but I didn’t want them to necessarily inform the sound world. So, this phase started with lots of free play in Ableton: crafting sounds, melodic material, and percussive patterns in the same way I would when creating music as jack of the suburbs. It felt very much like a typical composition process for me - working in Session View to test and experiment.

This then led to a shift in mindset that was new to me - thinking about how an individual instrument might sound, and how the sounds could be controlled and manipulated considering the parameters mapped out in the previous phase. I started with THROW and felt drawn to utilising its affordances to operate as a tonal/melodic instrument. Starting with what would typically be described as a ‘lead’ line, with notes operating in the higher frequency range, I used a Novation Launchpad Mini to simulate THROW triggering sounds with a synth built from Ableton’s Operator. Aware that there could be 8 variations of THROW’s sounds due to the affordances of PLACE, I noted that this would be THROW #1 (for now at least) as I played with the sound design and performance.

This meant giving THROW #1 a scale to work within, and utilising the Random MIDI Effect to trigger pitches at random from the scale, rather than cycling through them in a set sequence. The justification for this feels fairly superficial at this point: for me it feels more playful to play through notes at random, so intervals are a surprise - the predictability of working through a scale sequentially reminds me of scale practice on my guitar. If there was a way to cycle through a composed melody, this would be interesting to explore, but I haven’t reached a point of working with that kind of technical consideration yet.

I then worked on the sound that ROLL could make when rolled.

https://maxforlive.com/library/device/7393/midi-quantize
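Going back to THROW #1’s note selection: here is a minimal sketch of the two behaviours I was weighing up, picking pitches at random from a scale (roughly what the Random MIDI Effect is doing for me in Live) versus stepping through the scale in order. The scale, root note, and function names are assumptions for illustration, not what is actually in the Live set.

```python
# A minimal sketch contrasting random pitch selection from a scale with
# sequential cycling. Scale, root, and names are assumed for illustration.

import random

SCALE = [0, 3, 5, 7, 10]  # assumed: a minor pentatonic shape, as semitone offsets


def random_pitch(root=72, scale=SCALE):
    """Pick a random pitch from the scale, so each interval is a surprise."""
    return root + random.choice(scale)


def sequential_pitch(step, root=72, scale=SCALE):
    """Cycle through the scale in order, like running a scale exercise."""
    return root + scale[step % len(scale)]


if __name__ == "__main__":
    # Simulate eight THROW impacts with each behaviour.
    print("random:    ", [random_pitch() for _ in range(8)])
    print("sequential:", [sequential_pitch(i) for i in range(8)])
```

Running the random version a few times keeps throwing up intervals I wouldn’t have chosen, which is exactly the kind of playfulness I’m after; the sequential version is the scale-practice feel I wanted to avoid.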