Article ID: 537554
Journal: Signal Processing: Image Communication
Published Year: 2008
Pages: 11
File Type: PDF
Abstract

This paper proposes a new perceptual interface for the control of computer-based music production. We address the constraints imposed by the use of musical meta-instruments during live performance or rehearsal by tracking foot motion relative to a visual keyboard. The attribute visual reflects the fact that, unlike its physical counterpart, our keyboard provides no force feedback during key presses. The proposed tracking algorithm is structured on two levels: a coarse level for foot regions and a fine level for foot tips. Tracking runs in real time and efficiently handles the merging and unmerging of foot regions caused by spatial proximity and cast shadows. The output of the tracking is used for the spatiotemporal detection of key-“press” events.
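To make the final step concrete, the following is a minimal sketch of how tracked foot-tip positions might be turned into key-“press” events. The abstract does not specify the spatiotemporal criterion used; here a simple dwell-time rule over rectangular key regions is assumed, and all names (Key, KeyPressDetector, min_dwell_frames) are hypothetical, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple, List


@dataclass
class Key:
    """Axis-aligned region of the visual keyboard in image coordinates."""
    note: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


class KeyPressDetector:
    """Fires a key-'press' event when a tracked foot tip dwells inside one
    key region for at least `min_dwell_frames` consecutive frames.
    (Assumed spatiotemporal rule; the paper's actual criterion may differ.)"""

    def __init__(self, keys: List[Key], min_dwell_frames: int = 5):
        self.keys = keys
        self.min_dwell_frames = min_dwell_frames
        self.current_key: Optional[Key] = None  # key the tip is currently over
        self.dwell = 0                          # consecutive frames over that key
        self.fired = False                      # avoid re-firing while the tip stays put

    def update(self, tip_xy: Tuple[float, float]) -> Optional[str]:
        """Feed one per-frame foot-tip position from the fine-level tracker;
        return a note name when a 'press' is detected, otherwise None."""
        key = next((k for k in self.keys if k.contains(*tip_xy)), None)
        if key is not None and key is self.current_key:
            self.dwell += 1
        else:
            self.current_key, self.dwell, self.fired = key, 1, False
        if key is not None and self.dwell >= self.min_dwell_frames and not self.fired:
            self.fired = True
            return key.note
        return None
```

In this sketch the per-frame foot-tip coordinates produced by the fine tracking level are fed to `update`; a dwell threshold stands in for whatever temporal filtering the authors apply to suppress spurious presses while a foot merely passes over a key.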

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition
Authors