Article ID | Journal | Published Year | Pages
---|---|---|---
713378 | IFAC Proceedings Volumes | 2014 | 5
The ability to robustly track a human is an essential prerequisite for a growing number of applications that need to interact with a human user. This paper presents a robust vision-based algorithm for tracking a human in a dynamic environment using an interest point-based method. The tracker is expected to cope with changes in pose, scale, and illumination, as well as camera motion. Interest point-based tracking methods (e.g., those using SURF) suffer from the limitation that a sufficient number of matching keypoints for the target may not be available in every frame of a running video. One solution to this problem is an object model that contains SURF features for all possible poses and scaling factors; such a model, holding all possible descriptors, could be created off-line and used to detect the target in each and every frame. However, such a scheme cannot be used for tracking an object online. To overcome this problem, we propose a new approach that updates the object model online, so that sufficient matching keypoints remain available for the target under changes in pose and scale. Experimental results are provided to show the efficacy of the algorithm.
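The online model-update idea can be illustrated with a minimal sketch: match the current object model against each incoming frame and refresh the model's descriptors whenever the number of matches falls below a threshold. The sketch below uses OpenCV with ORB as a freely available stand-in for SURF (SURF needs the non-free opencv-contrib build); the video path, the `MIN_MATCHES` threshold, and the whole-frame model refresh are illustrative assumptions, not the paper's exact procedure.

```python
import cv2

# Sketch of interest-point tracking with an online-updated object model.
# ORB stands in for SURF here; the update logic is the point of interest.
detector = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

MIN_MATCHES = 10  # assumed threshold: refresh the model when matches run low

def detect(gray):
    """Detect keypoints and compute descriptors in a grayscale frame."""
    return detector.detectAndCompute(gray, None)

def track(model_desc, gray):
    """Match the current object model against a new frame.

    Returns the matches plus the frame's keypoints/descriptors so the
    caller can decide whether to refresh the model.
    """
    kp, desc = detect(gray)
    if desc is None or model_desc is None:
        return [], kp, desc
    matches = matcher.match(model_desc, desc)
    return matches, kp, desc

# --- usage (hypothetical video source) ---
cap = cv2.VideoCapture("video.mp4")
ok, first = cap.read()
gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
model_kp, model_desc = detect(gray)  # initial object model

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    matches, kp, desc = track(model_desc, gray)
    if len(matches) < MIN_MATCHES and desc is not None:
        # Pose or scale has changed enough that the old model no longer
        # matches well; refresh it with the current view. (In the actual
        # method the update would be restricted to the tracked target
        # region rather than the whole frame.)
        model_kp, model_desc = kp, desc
cap.release()
```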