ScreenPlay 1.1 Available to Download


[Image: GUI screenshot]

ScreenPlay version 1.1 is now available to download for free here!

The download includes a suite of Max for Live MIDI devices, an Ableton Live Audio Effect Rack, and Liine Lemur GUI templates.

Partials

Partials is a variable-spectrum additive synthesiser with MC at its heart. Available as both a Max/MSP collective and a Max for Live Instrument, the fundamental concept underpinning the design of Partials is one of variability and unpredictability.

The foundation of the instrument is the ioscbank~ object and, specifically, the dynamic variation of the distribution of partials between the fundamental frequency and the uppermost overtone, which can be either linear or increasingly logarithmic/exponential. Unfortunately, an MC variant of ioscbank~ does not exist, so it was necessary to utilise poly~ to mimic the functionality of such an object.
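To illustrate the idea (this is a hypothetical Python analogue, not Partials' actual patch), here is a minimal sketch of a variable partial distribution: a `curve` parameter of 0 spaces the partials linearly between the fundamental and the uppermost overtone, while higher values warp the spacing exponentially, crowding partials towards the fundamental. The function name and `curve` parameter are assumptions for illustration.

```python
import math

def partial_freqs(f0, f_max, n_partials, curve=0.0):
    """Frequencies of n_partials distributed between f0 and f_max.

    curve == 0.0 gives linear spacing; curve > 0.0 applies an
    exponential warp that crowds partials towards the fundamental.
    Assumes n_partials >= 2. Illustrative only.
    """
    freqs = []
    for i in range(n_partials):
        t = i / (n_partials - 1)  # normalised position, 0.0 .. 1.0
        if curve > 0.0:
            # Exponential warp, pinned so t stays 0 at f0 and 1 at f_max.
            t = (math.exp(curve * t) - 1.0) / (math.exp(curve) - 1.0)
        freqs.append(f0 + t * (f_max - f0))
    return freqs
```

Sweeping `curve` over time would mimic the dynamic variation of the distribution described above.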

Partials comprises three additive voices, which operate in unison, and a sub oscillator. Parameter settings such as partial distribution, filter cutoff, and resonance are controlled by random envelope generators and fluctuate within a range of values as defined by rslider objects or number boxes. Random envelope frequency (the interval between newly generated envelope values) can be defined for each individual parameter, whilst deviation can be applied to create variation between the three unison voices.
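The behaviour of the random envelope generators might be sketched as follows; this is a hypothetical Python analogue that abstracts away the timing side of envelope frequency, with each step standing in for one newly generated envelope value. A deviation amount spreads the three unison voices around a shared base value, as described above; all names here are illustrative assumptions.

```python
import random

def envelope_stream(lo, hi, n_voices=3, deviation=0.0, seed=None):
    """Yield successive random envelope targets within [lo, hi].

    Each step yields one value per unison voice. `deviation`
    (0.0 to 1.0) scales a per-voice random offset around the
    shared base value; 0.0 keeps all voices identical.
    """
    rng = random.Random(seed)
    span = hi - lo
    while True:
        base = rng.uniform(lo, hi)  # new shared envelope target
        voices = []
        for _ in range(n_voices):
            offset = rng.uniform(-1.0, 1.0) * deviation * span * 0.5
            # Clamp each voice back into the user-defined range.
            voices.append(min(hi, max(lo, base + offset)))
        yield voices
```

In Partials the interval between steps would itself be a per-parameter setting (the random envelope frequency); here it is simply one call to `next()`.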

Partials can be downloaded for free here!

ScreenPlay 1.1

Since completing my PhD in April 2017 I have started a new job working as a full-time Lecturer in Music Production at Edge Hill University. My research time in this role has been dedicated to the continued development of ScreenPlay, building on its existing strengths to make it a more stable, user-friendly, and practicable system.

Whilst some time has been spent on tweaking the underlying Markovian generative and topic-theory-inspired transformative algorithms, the majority of my attention has been focussed on a complete redesign of the GUI, as well as allowing for the Max for Live devices to be used completely independently of the GUI.

For the GUI redesign I decided to move from Hexler’s TouchOSC, which I had used previously, to Liine’s Lemur, in order to harness the latter’s far greater power and flexibility. The newly redesigned GUI is contained within a single page; a significant reduction from the three pages over which the TouchOSC GUI was previously distributed. Key/scale selection is now accessible via simple drop-down menus, as opposed to the selection matrix implemented in the previous iteration of the GUI.

Additionally, changes to the key/scale applied to the grid-based playing surface are now handled entirely within Lemur, with the MIDI note numbers assigned to individual pads changing according to the key/scale selection. Previously, this had been handled within the ScreenPlay-CTRL Max for Live MIDI device: the pads comprising the playing surface consistently output the same MIDI pitch information, which was then transposed accordingly upon being received by ScreenPlay-CTRL. Likewise, velocity control of notes, as defined by the velocity slider on the GUI, is now accounted for inside Lemur rather than in Max for Live, as had previously been the case. Together, these changes amount to an increased level of stability when using the system, as well as reduced latency and CPU load.
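The Lemur-side note assignment described above can be sketched roughly as follows; the scale tables, pad layout, and function names are illustrative assumptions rather than ScreenPlay's actual implementation. Each pad simply indexes into the chosen scale, wrapping into higher octaves as it runs off the end, so changing the key/scale re-derives every pad's MIDI note number directly.

```python
# Hypothetical sketch of mapping grid pads to MIDI note numbers
# for a chosen key/scale (not ScreenPlay's actual code).

SCALES = {
    "major": [0, 2, 4, 5, 7, 9, 11],
    "minor": [0, 2, 3, 5, 7, 8, 10],
}

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def pad_to_midi(row, col, key="C", scale="major",
                cols=8, base_octave=3):
    """MIDI note number for the pad at (row, col).

    Pads ascend through the scale left to right, bottom to top,
    starting from the tonic of `key` in `base_octave`.
    """
    degree = row * cols + col          # overall position in the grid
    intervals = SCALES[scale]
    octave, step = divmod(degree, len(intervals))
    root = 12 * (base_octave + 1) + NOTE_NAMES.index(key)
    return root + 12 * octave + intervals[step]
```

Velocity scaling could be folded in the same way, with the slider value applied before any MIDI leaves the controller layer.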

In order to constrain the entire GUI to a single page (one of the goals of the redesign), it was necessary to devise a way of accommodating both the button matrix grid and the generative/transformative algorithm controls within the confines of a relatively small space. This has been achieved by utilising the same space on the GUI for both elements and enabling users to switch between the two. The surrounding controls remain unchanged regardless of what is displayed in the centre of the GUI, affording users access to clip, key/scale, tempo, velocity, and quantisation settings at all times.

Changes have also been made to the meta-level controls afforded by the GUI, with parameter controls for existing elements being improved and entirely new meta control functionality being added. Specifically, parameters such as record/global quantisation and clip/loop length are now accessible via drop-down menus (much in the same way as key/scale selection), and clip management has been improved through the removal of dedicated buttons for deletion, which have been replaced by hold-to-delete functionality. In conjunction with the changes to the organisation/distribution of the playing surface and algorithmic controls, all of this amounts to a streamlined interface with reduced demands on screen real estate. Newly introduced meta-level control functionality includes the addition of a play/stop button as well as a MIDI-mappable drop-down menu for part selection when the system is being used in single-mode. Both elements are important additions with respect to improving the integration of ScreenPlay into the existing studio/live setups of practising electronic musicians.

Continuing in this vein, controls for the Markovian generative and topic-theory-inspired transformative algorithms have been added directly to the Max for Live MIDI Device GUIs, allowing for them to be used entirely independently of the Lemur GUI. As before, two-way communication between interfaces is exhibited when ScreenPlay is running in multi-mode, in that changes made to global parameters by one user are reflected in the interfaces of the others. Similarly, when running in single-mode, the GUI updates to reflect the status of the currently selected part. This functionality has now been extended so that two-way communication exists between the Max for Live MIDI Device GUIs and the Lemur GUI, which is particularly useful when using ScreenPlay in single-mode as part of an existing compositional/performative setup.
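The two-way communication can be modelled abstractly as a shared parameter store that pushes every change out to all registered interfaces except the one that originated it. The Python sketch below is purely illustrative; ScreenPlay itself handles this between the Lemur GUI and the Max for Live devices, and all class and method names here are assumptions.

```python
# Illustrative model of two-way parameter synchronisation between
# multiple interfaces (not ScreenPlay's actual implementation).

class SharedState:
    def __init__(self):
        self.params = {}
        self.interfaces = []

    def register(self, interface):
        self.interfaces.append(interface)

    def set_param(self, name, value, source):
        self.params[name] = value
        # Reflect the change in every interface except the originator.
        for iface in self.interfaces:
            if iface is not source:
                iface.refresh(name, value)

class Interface:
    def __init__(self, label, state):
        self.label = label
        self.state = state
        self.view = {}      # what this interface currently displays
        state.register(self)

    def user_change(self, name, value):
        # A user edits a parameter on this interface.
        self.view[name] = value
        self.state.set_param(name, value, source=self)

    def refresh(self, name, value):
        # Another interface changed the parameter; update the display.
        self.view[name] = value
```

In single-mode, switching the selected part would amount to repopulating each interface's view from the store for that part.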

To read more about the conceptual framework underpinning ScreenPlay please refer here.

My First Track with the Teenage Engineering OP-Z

Since the beginning of the new year I have seen a number of live performances on Instagram and YouTube in response to #Jamuary2019. The idea behind the hashtag seems to be to create a new piece of music or a new live performance/jam each day in January. A similar movement has also taken shape within the visual arts under the slightly more clunky hashtags #Creatuanuary and #Creatuanuary2019, whereby artists aim to produce a piece of work resembling a creature of some description for every day in the month of January.

While I haven’t the time to complete a new piece of music every day throughout the month, my interest was certainly piqued by #Jamuary2019 as a way to get back into the swing of things and boost my productivity early in the year; hopefully I can continue in the same vein for the remainder of the year to come. The piece Z019 is my first response to #Jamuary2019, and also the first piece of music I have made entirely with the OP-Z since picking one up just before Christmas.

I have somewhat taken my time in composing Z019 whilst reading through the manual and familiarising myself with the interface and functionality of the device. There are some minor issues (the random Parameter Spark seemingly not working) that have required me to compromise on some of my creative intentions, and that will hopefully be fixed in future firmware updates; by and large, however, I have found the OP-Z to be an incredibly intuitive, engaging, and, most importantly, fun instrument to work with. My creative process is ordinarily dominated by precise sound design work and parameter modulation, and the fact that each of the synth engines offered by the OP-Z affords only a handful of parameters (still enough to yield widely varying results, given the variety and number of synth engines from which to choose) has helped me to focus more on just making music. The myriad ways in which stochastic procedures can be implemented into the user’s workflow when creating with the OP-Z also hold great appeal for me.

Following the completion of Z019, I now intend to begin working on a piece incorporating the OP-Z, Make Noise 0-COAST, and Max/MSP, which will hopefully be finished before the end of January and can act as my second response to the #Jamuary2019 movement.

A WAV file of Z019 can be downloaded here.


Barcelona Metro

Originally composed for the Martini Elettrico festival hosted by Conservatorio di Musica Giovan Battista Martini of the University of Bologna in April 2018, the piece Barcelona Metro is founded upon a recording captured many years ago of a busker in a Barcelona Metro station playing a rendition of “Every Breath You Take” by The Police. The sample has been heavily processed using the Iota granular looping Max for Live instrument and can be heard throughout the piece; most notably in the introduction and breakdown sections. In addition to this, a multitude of LFOs employing various waveforms have been used to control a vast array of parameters, from simple panning to the timbre of the kick drum, the glitchy beat repeat effect applied to the drums, and the note duration of the arpeggiated lead. There is also an arpeggiated bass that slowly moves in and out of synchronisation with a second lead that, for the most part, mimics the notes played in the bass, only higher in register. The accumulation of these stochastic processes results in a piece that is, at times, chaotic and unpredictable, whilst simultaneously retaining many stylistic traits more commonly associated with popular electronic music.
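The slow drift in and out of synchronisation described above is easy to sketch: two cyclic modulators (or patterns) running at slightly different rates gradually accumulate a phase offset. Here is a minimal illustrative LFO in Python, not taken from the piece's actual patch; the rates and names are assumptions.

```python
import math

def lfo(rate_hz, t, shape="sine"):
    """Value of a simple LFO at time t (seconds). Illustrative only."""
    phase = (rate_hz * t) % 1.0  # normalised phase, 0.0 .. 1.0
    if shape == "sine":
        return math.sin(2.0 * math.pi * phase)
    if shape == "saw":
        return 2.0 * phase - 1.0
    raise ValueError("unknown shape: " + shape)

# Two modulators detuned by 0.01 Hz drift apart by a full cycle
# every 100 seconds, slowly moving in and out of phase.
drift = [lfo(0.50, t) - lfo(0.51, t) for t in range(60)]
```

Applying such detuned modulators to note timing or parameter values yields exactly the kind of gradual (de)synchronisation the piece relies on.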

This is the first piece of music I have finished in quite some time; I hope you enjoy it.