Synthesizing expressive music through the language of conducting

Research output: Contribution to journal › Article › peer-review

Abstract

This article presents several novel methods that have been developed to interpret and synthesize music to accompany conducting gestures. The central technology used in this project is the Conductor's Jacket, a sensor interface that gathers its wearer's gestures and physiology. A bank of software filters extracts numerous features from the sensor signals; these features then generate real-time expressive effects by shaping the note onsets, tempos, articulations, dynamics, and note lengths in a musical score. The result is a flexible, expressive, real-time musical response. This article features the Conductor's Jacket software system and describes in detail its architecture, algorithms, implementation issues, and resulting musical compositions.

Original language: English
Pages (from-to): 11-26
Number of pages: 16
Journal: International Journal of Phytoremediation
Volume: 21
Issue number: 1
DOIs
State: Published - 2002

