Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
7122782 | Measurement | 2016 | 10 | |
Abstract
In the second part, the human action recognition algorithm is implemented for upper-body gestures. A human action dataset is also created for the upper-body movements: each action is performed 10 times by 24 users, and the collected joint angles are divided into six action classes. Extreme Learning Machines (ELMs) are used to classify the human actions, and Feed-Forward Neural Network (FNN) and K-Nearest Neighbor (K-NN) classifiers are used for comparison. According to the comparative results, ELMs achieve good human action recognition performance.
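The classification step described above can be illustrated with a minimal Extreme Learning Machine sketch: a random hidden layer maps joint-angle feature vectors to a high-dimensional space, and the output weights are solved in closed form by least squares. The feature dimension, hidden-layer size, and sigmoid activation below are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np

class ELMClassifier:
    """Minimal ELM sketch for six-class action recognition on joint-angle features."""

    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random input weights and biases stay fixed after training (core ELM idea).
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))  # sigmoid activation

    def fit(self, X, y, n_classes):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        T = np.eye(n_classes)[y]            # one-hot encode the action labels
        self.beta = np.linalg.pinv(H) @ T   # output weights via least squares
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)

# Hypothetical usage with joint-angle feature arrays and six action classes:
# elm = ELMClassifier(n_hidden=200).fit(X_train, y_train, n_classes=6)
# accuracy = (elm.predict(X_test) == y_test).mean()
```

Because only the output weights are learned, training reduces to a single pseudoinverse, which is the usual argument for comparing ELMs against iteratively trained FNNs.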
Related Topics
Physical Sciences and Engineering
Engineering
Control and Systems Engineering
Authors
Emrehan Yavşan, Ayşegül Uçar