(2006) [PDF] "Towards Autonomous Agents for Live Computer Music: Realtime Machine Listening and Interactive Music Systems" PhD Thesis. Centre for Music and Science, Faculty of Music, University of Cambridge. Submitted 17/08/06, viva 17/11/06, officially released 25/01/07.
Musical agents that can interact with human musicians in concert situations are a reality, though the extent to which they embody human-like capabilities can be called into question. Given the current state of artificial intelligence technology, they are perhaps best viewed as 'projected intelligences': a composer's anticipation of the dynamics of a concert setting, made manifest in programming code. This thesis describes a set of interactive systems developed for a range of musical styles and instruments, all of which attempt to participate in a concert by means of audio signal analysis alone.

Machine listening, that is, the simulation of human peripheral auditory abilities and the hypothetical modelling of central auditory and cognitive processes, is used in these systems to track musical activity. While much of this modelling is inspired by a bid to emulate human abilities, strategies diverging from plausible human physiological mechanisms are often employed, leading to machine capabilities which exceed or differ from their human counterparts.

Technology is described which detects events in an audio stream, analysing the discovered events (typically notes) for the perceptual features of loudness, pitch, attack time and timbre. To exploit processes that underlie common musical practice, beat tracking is investigated, allowing the inference of metrical structure which can act as a co-ordinative framework for interaction. Psychological experiments on human judgement of perceptual attack time, and on beat tracking to ecologically valid stimuli, clarify the parameters and constructs most appropriately instantiated in the computational systems.

All the technology produced is intended for the demanding environment of realtime concert use. In particular, an algorithmic audio splicing and analysis library called BBCut2 is described, designed with appropriate processing and scheduling faculties for realtime operation.
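The event-detection stage sketched in the abstract is commonly built on an onset detection function such as spectral flux, peak-picked against an adaptive threshold. As an illustration only (the function names, parameters and thresholding below are my own simplified choices, not the algorithms from the thesis), a minimal offline sketch:

```python
import numpy as np

def onset_frames(signal, frame_size=1024, hop=512, threshold=1.5):
    """Return indices of frames where spectral flux forms a local peak
    above a moving-median threshold: a crude offline stand-in for a
    realtime onset detector."""
    n_frames = 1 + (len(signal) - frame_size) // hop
    window = np.hanning(frame_size)
    prev_mag = np.zeros(frame_size // 2 + 1)
    flux = np.zeros(n_frames)
    for i in range(n_frames):
        frame = signal[i * hop : i * hop + frame_size] * window
        mag = np.abs(np.fft.rfft(frame))
        # Spectral flux: summed positive change in magnitude between frames.
        flux[i] = np.sum(np.maximum(mag - prev_mag, 0.0))
        prev_mag = mag
    # Absolute gate so numerical noise in silent regions cannot fire.
    floor = 0.1 * flux.max() if flux.max() > 0 else 0.0
    onsets = []
    for i in range(1, n_frames - 1):
        local = flux[max(0, i - 5) : i + 6]
        if (flux[i] > max(threshold * np.median(local), floor)
                and flux[i] >= flux[i - 1] and flux[i] > flux[i + 1]):
            onsets.append(i)
    return onsets
```

In a realtime concert system the same flux computation would run per incoming FFT frame, with the non-causal peak-picking replaced by a causal threshold test, and each detected event then passed on for loudness, pitch, attack-time and timbre analysis.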
Proceeding to compositional applications, novel interactive music systems are introduced which have been tested in real concerts. These are evaluated through interviews with the musicians who performed with them, and through an assessment of their claims to agency in the sense of 'autonomous agents'. The thesis closes by considering all that has been built, and the possibilities for future advances allied to artificial intelligence and signal processing technology.
Collins, Nick (2006) "Towards Autonomous Agents for Live Computer Music: Realtime Machine Listening and Interactive Music Systems", PhD thesis, University of Cambridge.
Associated code (for SuperCollider 3)
bbcut2: beat tracking, event analysis and automated audio cutting library. (Released under the GNU GPL).
Five Interactive Music Systems: source files, including scores and code, for all five interactive music systems described in my PhD thesis. Note that some systems require other packages from this page. (Released under the GNU GPL).
LiveCoding1: live coding mixer, also required for two of the five works above. (Released under the GNU GPL).
Qitch: constant-Q transform pitch tracker SC3 plugin. Source code and precompiled version for Mac G4.
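A constant-Q analysis like the one underlying Qitch spaces its bins geometrically so that each bin's bandwidth is a fixed fraction of its centre frequency, giving equal resolution per musical interval. A minimal sketch of the idea in Python (illustrative only: the names and parameters are mine, and this direct time-domain correlation is not how the Qitch plugin is implemented):

```python
import numpy as np

def cq_pitch(frame, sr, fmin=110.0, bins_per_octave=24, n_octaves=4):
    """Estimate pitch as the centre frequency of the strongest
    constant-Q bin. Bin k sits at fmin * 2**(k / bins_per_octave), and
    its window length is proportional to 1/frequency, holding
    Q = frequency / bandwidth constant across bins."""
    q = 1.0 / (2.0 ** (1.0 / bins_per_octave) - 1.0)
    n_bins = bins_per_octave * n_octaves
    freqs = fmin * 2.0 ** (np.arange(n_bins) / bins_per_octave)
    mags = np.zeros(n_bins)
    for k, f in enumerate(freqs):
        n = min(int(np.ceil(q * sr / f)), len(frame))  # shorter window as f rises
        t = np.arange(n)
        kernel = np.hanning(n) * np.exp(-2j * np.pi * f * t / sr)
        mags[k] = np.abs(np.dot(frame[:n], kernel)) / n
    return freqs[np.argmax(mags)]
```

A real plugin would precompute sparse spectral-domain kernels and apply them to FFT frames rather than correlating in the time domain, which is what makes constant-Q analysis feasible in realtime.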