Article code: 527857 | Journal code: 869391 | Year: 2012 | English article, 17 pages, PDF | Free full-text download
English title of the ISI article
Estimating pose of articulated objects using low-level motion
Related subjects
Engineering and Basic Sciences | Computer Engineering | Computer Vision and Pattern Recognition
English abstract

In this work, a method is presented to track and estimate the pose of articulated objects using the motion of a sparse set of moving features. This is achieved with a bottom-up generative approach based on the Pictorial Structures representation [1]. However, unlike previous approaches that rely on appearance, our method depends entirely on motion: initial low-level part detection is based on how a region moves rather than on how it looks. This work is best described as Pictorial Structures using motion. A standard feature tracker is used to automatically extract a sparse set of features. These features typically contain many tracking errors; however, the presented approach is able to overcome both these errors and the sparsity of the features. The proposed method is applied to two problems: 2D pose estimation of articulated objects walking side-on to the camera, and 3D pose estimation of humans walking and jogging at arbitrary orientations to the camera. In each domain, quantitative results are reported that improve on the state of the art. The motivation of this work is to illustrate the information present in low-level motion that can be exploited for the task of pose estimation.
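The Pictorial Structures representation cited above models an articulated object as a tree of parts, where each part has a unary (detection) cost and connected parts pay a pairwise deformation cost; on a tree, the globally optimal configuration can be found by min-sum dynamic programming. The sketch below shows that inference on a simple chain of parts. All part counts, costs, and candidate locations are illustrative assumptions, not values from the paper, which builds its unary costs from motion rather than appearance.

```python
# Minimal sketch of Pictorial Structures inference on a chain of parts,
# via min-sum dynamic programming. Costs here are made up for illustration.

def chain_pictorial_structures(unary, pairwise):
    """unary[i][k]: cost of placing part i at its k-th candidate location.
    pairwise[i](j, k): deformation cost between part i-1 at candidate j
    and part i at candidate k (pairwise[0] is unused).
    Returns (total_cost, best_assignment)."""
    n = len(unary)
    # msg[k]: best cost of placing parts 0..i with part i at candidate k
    msg = list(unary[0])
    back = []  # back-pointers for recovering the optimal assignment
    for i in range(1, n):
        new_msg, ptr = [], []
        for k in range(len(unary[i])):
            best_j = min(range(len(msg)),
                         key=lambda j: msg[j] + pairwise[i](j, k))
            new_msg.append(msg[best_j] + pairwise[i](best_j, k) + unary[i][k])
            ptr.append(best_j)
        msg = new_msg
        back.append(ptr)
    # Backtrack from the cheapest final-part candidate.
    k = min(range(len(msg)), key=lambda j: msg[j])
    total = msg[k]
    assignment = [k]
    for ptr in reversed(back):
        k = ptr[k]
        assignment.append(k)
    assignment.reverse()
    return total, assignment


# Hypothetical two-part example: two candidate locations per part,
# quadratic deformation penalty between neighbouring parts.
unary = [[1, 3], [2, 0]]
pairwise = [None, lambda j, k: (j - k) ** 2]
cost, config = chain_pictorial_structures(unary, pairwise)
```

Because the model is a tree, this dynamic program is exact and runs in time linear in the number of parts (and quadratic in candidates per part); the paper's contribution is in how the unary terms are derived from sparse tracked motion, not in the inference itself.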


► It is shown that the motion of a sparse set of tracked features can be used to estimate human pose.
► It is shown that gait phase can be extracted without exploiting the spatial distribution of the features.
► 3D human pose is estimated using only a monocular view.
► The presented approach does not require the pose to be initialized in the first frame.

Publisher
Database: Elsevier - ScienceDirect
Journal: Computer Vision and Image Understanding - Volume 116, Issue 3, March 2012, Pages 330–346