| Article ID | Journal | Published Year | Pages | File Type |
| --- | --- | --- | --- | --- |
| 6941969 | Signal Processing: Image Communication | 2015 | 12 | |
Abstract
In this paper, we propose a stratified gesture recognition method that integrates rough set theory with the longest common subsequence method to classify free-air gestures for natural human-computer interaction. Gesture vocabularies are often composed of gestures that are highly correlated, or that contain other gestures as proper parts; this reduces the accuracy of most classifiers if no further action is taken. Gestures are encoded as sequences of orientation segments, which facilitates their analysis and reduces processing time. To improve recognition accuracy on ambiguous gestures, we generate rough set decision tables conditioned on the longest common subsequences; these tables store discriminative information on ambiguous gestures. Stratified gesture recognition proceeds efficiently in two steps: a gesture is first classified into its equivalence class under a predefined rough set indiscernibility relation, and then recognized using the normalized longest common subsequence paired with the rough set decision tables. Experimental results show an improvement over the recognition rate of the longest common subsequence alone: on pre-isolated gestures we achieve improvements of 6.06% and 15.09%, and on stream gestures 19.79% and 28.4%, on digit and alphabet gesture vocabularies, respectively.
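To make the comparison step concrete, the following is a minimal sketch of a normalized longest common subsequence similarity between two gestures encoded as orientation codes. The 8-direction coding (0-7), the example sequences, and normalization by the longer sequence's length are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch: normalized LCS similarity between two gestures
# encoded as orientation-segment codes. Encoding and normalization
# choices here are assumptions for demonstration purposes.

def lcs_length(a, b):
    """Classic dynamic-programming LCS length between two sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def normalized_lcs(a, b):
    """LCS length scaled to [0, 1] by the longer sequence's length."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))

# Two hypothetical gestures as orientation codes (0-7):
g1 = [0, 1, 1, 2, 3, 3, 4]
g2 = [0, 1, 2, 2, 3, 4]
print(normalized_lcs(g1, g2))  # LCS length 5 over max length 7 ≈ 0.714
```

A score near 1 indicates highly similar trajectories; in the stratified scheme described above, ambiguous cases with close scores would then be disambiguated by the rough set decision tables.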
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Vision and Pattern Recognition
Authors
Clementine Nyirarugira, TaeYong Kim