Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6883333 | Computers & Electrical Engineering | 2018 | 13 Pages | |
Abstract
Hand gestures will become a mainstream method of manipulating human-computer interfaces (HCIs). For disabled people with mobility problems, hand gesture-based HCIs should be specifically designed. To achieve an effective hand gesture HCI, this study integrated a mobile service robot platform, three-dimensional (3D) imaging sensors, and a wearable Myo armband device. Four kernel techniques are presented: (1) hand gesture recognition with the Myo armband software development kit, using a two-layer hierarchy scheme to significantly increase the number of hand gesture commands; (2) identity recognition of users using clustering-based support vector machine classifiers with a designed root mean square surface electromyography (RMS-sEMG) feature; (3) robot vehicle navigation with effective obstacle avoidance using a conceptually simple and computationally fast approach; and (4) efficient vehicle positioning based on face-detection information about the user, provided by the 3D imaging sensor, so that the vehicle can receive the hand gesture commands of the user with disabilities.
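The RMS-sEMG feature named in technique (2) is, in its standard form, the root mean square of each sEMG channel over a short analysis window. The sketch below is an illustrative assumption of that standard computation, not the paper's actual code; the function name, window length, and channel count (the Myo armband exposes 8 sEMG channels) are chosen for the example.

```python
import numpy as np

def rms_semg_feature(window):
    """Per-channel RMS of an sEMG window: sqrt(mean(x^2)) along time.

    window: array of shape (n_samples, n_channels).
    Returns an n_channels-dimensional feature vector.
    """
    window = np.asarray(window, dtype=float)
    return np.sqrt(np.mean(window ** 2, axis=0))

# Example: a 50-sample window from the Myo's 8 sEMG channels
rng = np.random.default_rng(0)
window = rng.normal(size=(50, 8))
feature = rms_semg_feature(window)  # 8-dimensional RMS feature vector
```

In a pipeline like the one the abstract describes, such per-window feature vectors would then be fed to the support vector machine classifiers for user identity recognition.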
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Networks and Communications
Authors
Ing-Jr Ding, Rui-Zhi Lin, Zong-Yi Lin