Improving upon musical analyses of conducting gestures using computer vision

Teresa Marrin Nakra, Daniel Tilden, Andrea Salgian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

For more than ten years, researchers have been developing software-based methods for analyzing the movements of orchestral conductors. Beginning with the Conductor's Jacket research project, researchers have combined wearable sensors with face-on video recordings to improve the tracking and understanding of the structure inherent in conducting gestures. Building upon more recent work employing computer vision, the current project refines the methods for tracking the hands and visualizes the data in a way that reveals more of the underlying structure in an easier-to-view fashion. Such improved methods will enable a more specific understanding of the functions of conducting gestures and allow for more concrete applications of those gestures: interactive conducting systems for public exhibits, video games, and the integration of dynamic audio enhancements into live concerts.

Original language: English
Title of host publication: International Computer Music Conference, ICMC 2010
Publisher: International Computer Music Association
Pages: 230-233
Number of pages: 4
ISBN (Electronic): 0971319286
State: Published - 2010
Event: International Computer Music Conference, ICMC 2010 - New York City and Stony Brook, United States
Duration: 1 Jun 2010 - 5 Jun 2010

Publication series

Name: International Computer Music Conference, ICMC 2010

Conference

Conference: International Computer Music Conference, ICMC 2010
Country/Territory: United States
City: New York City and Stony Brook
Period: 1/06/10 - 5/06/10
