Faculty Advisor

Richard Campbell

Frederick Bianchi

David Cyganski

Abstract

This work involves the design and implementation of a real-time machine-vision-based Human Computer Interface (HCI) that analyzes and interprets a music conductor's gestures to detect the musical "beat". The HCI system interfaces directly with the "Virtual Orchestra", an electronic MIDI-sequenced "orchestra". Prior to the development of this HCI system, the tempo of the "Virtual Orchestra" could be controlled in real time only by tapping a tempo on a MIDI controller device--a process that is foreign to most music conductors. The real-time beat information detected by this HCI system allows a conductor to conduct the "Virtual Orchestra" as if it were a live orchestra. The system was developed using the Broadway real-time color image capture board manufactured by Data Translation, Incorporated. The implementation involved Microsoft Visual C++, Microsoft Foundation Classes (MFC) for the Graphical User Interface (GUI), Video for Windows (VFW), MIDI note generation, and Intel assembly-level code optimization. Algorithms were developed for rapid RGB color thresholding, multiple-contour extraction, fast contour-based area and center-of-mass calculations, and gesture interpretation. Real-time, live-video interpretation has been achieved, and an end-to-end system has been demonstrated in conjunction with a MIDI sequencer.

Publisher

Worcester Polytechnic Institute

Degree Name

MS

Department

Electrical & Computer Engineering

Project Type

Thesis

Date Accepted

1999-05-11

Accessibility

Unrestricted

Subjects

assembly, broadway, contour extraction, virtual orchestra, MIDI, area calculation, center of mass calculation, Video For Windows, color thresholding, Computer vision, Conducting, Data processing, Conductors (Music), MIDI (Standard), C++ (Computer program language), Human-computer interaction
