A concert of software autonomy in music, free improvising algorithms, generative systems and new interfaces for musically metacreative expression
featuring Daryl Jahnke (guitar), Lisa Cay Miller (piano), François Houle (clarinet)
with autonomous software systems by Oliver Bown, Arne Eigenfeldt, Shelly Knotts, George Lewis, Doug van Nort and Michael Young
Curated by Arne Eigenfeldt and Oliver Bown
ISEA 2015 Vancouver
Monday, August 17, 2015, 8pm, Goldcorp Centre for the Arts, SFU Woodward’s, Studio T
Improvising Algorithms presents live duets between improvising musicians and “live algorithms”, computer applications that engage with human performers. The software is autonomous, acting impulsively and unpredictably; neither automatic nor controlled, it participates in the creative performance. Each piece brings together an improvising musician with an improvising piece of software, in some cases for the first time, in others as the culmination of a long-term collaboration.
Arne Eigenfeldt (Canada) with Daryl Jahnke, guitar
In these three movements for metacreative musical agents and improvising guitar, the system is under the watchful eye (and hand) of the composer, mainly to constrain the performance within a concert format (the system is designed for ongoing installations). Agents (musebots) function within an ensemble, communicating their actions and plans, and responding to an organizing agent that generates rhythmic and harmonic structures learned from a corpus. As well as generative audio, three musical robots perform the music. The machine-generated musical framework is presented to the performer, allowing him to be an active agent within the creation.
Michael Young (UK) with Lisa Cay Miller, piano
This is one of a developing series of duos for a human and a machine performer. Both “musicians” adapt to each other through mutual listening (i.e., via audio only) and response as the performance develops. The human’s improvisation is encoded by the computer through statistical analysis of extracted features and by cataloguing these in real time. Each observation made by the computer is assigned to a set of musical output behaviors. Recurring features of the player’s improvisation can then be recognized by the computer. The machine “expresses” this recognition by developing, and modifying, its own musical output, just as another player might.
Doug van Nort (Canada) with François Houle, clarinet
This piece presents the Freely Improvising Learning and Transforming Evolutionary Recombination (FILTER) system, in an improvised duo with François Houle. The project explores themes such as sonic gestural understanding, stylistic tendencies, textural shifts and transformations of the lived episodic memory as it develops in the moment of performance. The work was born from a desire to reflect upon, and perhaps model, my own human performance practice with my Granular-feedback Expanded Instrument System (GREIS), wherein I often capture and transform the musical streams from other performers on the fly.
Oliver Bown (Australia) with François Houle, clarinet
Zamyatin is a simple improvising system that has been creatively hacked together by its maker in a bricolage manner. It is part of an ongoing study into software systems that act in performance contexts with autonomous qualities. The system has been tweaked to find interesting degrees of interaction between this responsivity and internal generativity, and then ‘sonified’ through the composition of different output modules. Zamyatin’s name derives from the Russian author whose dystopian vision included machines for systematic composition that removed the savagery of human performance from music. Did he ever imagine the computer music free-improv of the early 21st century?
Reciprocal Study for Piano and Bots
Shelly Knotts (UK) with Lisa Cay Miller, piano
A new work for piano and a society of sound-producing bots. The bots are pre-generated at the beginning of the piece using a semi-random sound-producing process. Over the course of the piece, bots are activated when the pianist’s playing resembles the characteristics of each bot. Each time a bot is activated it learns from the pianist’s playing, incorporating the piano’s characteristics into its own sound generation. Alongside this process, a pitch tracker attempts to predict the pianist’s next pitches, gaining accuracy as the piece progresses. When the tracker correctly predicts a pitch, a piano-like sound is played in the electronics part. Both processes aim to evolve a semi-random electronic texture towards the pianist’s playing through a process of exploration.
George Lewis (USA) with François Houle, clarinet, and Lisa Cay Miller, piano
Voyager (the program) analyzes aspects of an improvisor’s performance in real time, using that analysis to guide an automatic composing program that generates complex responses to the musician’s playing. This implies a certain independence of action, and indeed, the program exhibits generative behavior independent of the human performer. The system is not an instrument, and therefore cannot be controlled by the performer. The Voyager project was started during 1985-86, while Lewis was a composer-in-residence at the Studio voor Elektro-Instrumentale Muziek in Amsterdam.