Interfacing Jazz: A Study in Computer-Mediated Jazz Music Creation and Performance
This page is dedicated to my research on automatic jazz music generation, developed for my PhD in the UT Austin/Portugal Digital Media doctoral program, at the University of Porto, and in the SMC research group at INESC-TEC.
This research focuses on the study and development of computer-mediated interfaces and algorithms for music performance and creation. It is mainly centered on traditional jazz accompaniment, and explores meta-control over musical events as a way to open the rich experience of playing jazz to musicians and non-musicians alike, both individually and collectively.
For this, a group of specially designed algorithms and control interfaces was developed, implementing intelligent, musically informed processes to automatically produce stylistically correct musical events.
This work continues the research carried out in the development of GimmeDaBlues, an iOS app that combines an intuitive interface with an algorithmic music generator.
The interactive interface and instrument algorithms were further developed and implemented in the creation of MyJazzBand, an interactive musical installation that allows up to four users to play in a virtual jazz band, using a custom-developed graphical interface on a multi-touch display.
The text below provides an overview of the algorithms and interface strategies developed during this research and implemented in MyJazzBand.
Piano dynamic voicing calculator
The piano interface implements a dynamic voicing calculator algorithm. Each time the song chord changes, the algorithm receives the chord and associated scale information, and calculates a set of voicings, one for each of the available keys on the virtual keyboard.
Taking as an example a C7 chord with the Mixolydian scale, the calculator algorithm maps the scale notes (C, D, E, G, A, Bb) to the keys. The 4th degree (F) is omitted because it clashes with the chord's 3rd (E), and is commonly treated as an avoid note over a dominant chord.
It then calculates a voicing for each key, using the scale note as the top note and completing it with two more notes below that fit the current chord and style criteria.
The first three (left) keys on the virtual keyboard keep only one note assigned, as playing chords in this low register is usually avoided. Each of the following keys is assigned a three-note voicing.
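The voicing calculator described above can be sketched as follows. This is a minimal illustration, not the actual MyJazzBand implementation: all function names are hypothetical, MIDI note numbers are used for pitches, and the register threshold and note-selection criteria are simplified assumptions.

```python
# Sketch of the dynamic voicing calculator for a C7 chord (Mixolydian scale).
# All names and thresholds are illustrative assumptions, not the real code.

SCALE = [0, 2, 4, 7, 9, 10]   # C Mixolydian minus the 4th (F): C D E G A Bb
CHORD_TONES = {0, 4, 7, 10}   # pitch classes of C7: root, 3rd, 5th, b7

def map_scale_to_keys(root=48, n_keys=12):
    """Assign one scale note per virtual key, ascending from the root (C3)."""
    notes = []
    i = 0
    while len(notes) < n_keys:
        notes.append(root + SCALE[i % len(SCALE)] + 12 * (i // len(SCALE)))
        i += 1
    return notes

def voicing_for(top_note, low_limit=53):
    """Build a voicing with `top_note` on top, filled with chord tones below.

    Notes below `low_limit` stay single, since chords in the low register
    are usually avoided (the first three keys in the text above).
    """
    if top_note < low_limit:
        return [top_note]
    voicing = [top_note]
    candidate = top_note - 1
    while len(voicing) < 3 and candidate > top_note - 12:
        if candidate % 12 in CHORD_TONES:
            voicing.append(candidate)
        candidate -= 1
    return sorted(voicing)

# Recompute the whole set whenever the song chord changes:
keys = map_scale_to_keys()
voicings = [voicing_for(n) for n in keys]
```

In this sketch the three lowest keys yield single notes, and every other key yields a three-note voicing capped by its scale note, mirroring the layout described above.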
Bass line generation
The bass algorithm automatically generates phrases that define shapes or contours by interpolating from one chord to the next, calculating suitable in-between notes to produce natural and musical bass lines.
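The interpolation idea can be illustrated with a short sketch: walk from the current chord's root toward the next one, snapping each in-between beat to the nearest scale tone. The function names, the scale, and the repeat-avoidance rule are all assumptions for illustration, not the actual algorithm.

```python
# Hypothetical sketch of contour-based bass line generation.
# Pitches are MIDI note numbers; the scale choice is illustrative.

C_MIXOLYDIAN = [0, 2, 4, 5, 7, 9, 10]  # full scale, used for passing tones

def nearest_scale_note(target, scale):
    """Return the pitch closest to `target` whose pitch class is in `scale`."""
    candidates = [n for n in range(int(target) - 6, int(target) + 7)
                  if n % 12 in scale]
    return min(candidates, key=lambda n: abs(n - target))

def bass_line(from_root, to_root, beats=4):
    """Interpolate a linear contour between two chord roots over `beats`."""
    line = [from_root]
    for b in range(1, beats):
        target = from_root + (to_root - from_root) * b / beats
        note = nearest_scale_note(target, C_MIXOLYDIAN)
        if note == line[-1]:  # avoid repeating a note: step past it
            note = nearest_scale_note(target + 1, C_MIXOLYDIAN)
        line.append(note)
    return line

# Walking from C2 toward F2 over one bar yields a stepwise ascending line:
walk = bass_line(36, 41)
```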
Dynamic response of the virtual drummer and bass player
MyJazzBand's virtual drummer and bass player algorithms change dynamically according to different activity levels. This means that their behaviour can vary according, for example, to the density of events at a given moment of the song, or in response to the user's activity. As the user plays more notes, the activity level rises. This level is tracked and used to control the drum and bass algorithms, so that they respond much as a human jazz band would.
In the following example, the activity level of the drums and bass algorithms is controlled manually: it is set to zero during most of the first chorus, then raised to the maximum (127) during the second chorus.
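One plausible way to implement the activity tracking described above is shown below. This is a hedged sketch under assumed parameters: the class name, the per-note boost, the decay rate, and the density mapping are all illustrative; only the 0-127 range (matching MIDI controller values, as in the example above) comes from the text.

```python
# Illustrative sketch of activity-level tracking driving accompaniment density.
# All names and constants are assumptions, not the MyJazzBand implementation.

class ActivityTracker:
    def __init__(self, boost=8.0, decay=0.95):
        self.level = 0.0      # current activity, clamped to 0-127
        self.boost = boost    # increment per user note event
        self.decay = decay    # multiplicative fade per beat

    def note_played(self):
        """Each user note raises the activity level, up to the maximum."""
        self.level = min(127.0, self.level + self.boost)

    def tick(self):
        """Called once per beat: the level fades when the user goes quiet."""
        self.level *= self.decay

def drum_density(level):
    """Map activity 0-127 to a probability of adding an extra subdivision."""
    return 0.1 + 0.8 * (level / 127.0)
```

With this mapping, a silent user leaves the drummer playing sparsely (10% chance of extra hits), while sustained playing pushes it toward a busy 90%, in the spirit of the human-band response described above.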