Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
525772 | 869024 | 2012 | 12-page PDF | Free download |

In this work, we propose a new integrated framework that addresses the problems of thermal–visible video registration, sensor fusion, and people tracking for far-range videos. The video registration is based on RANSAC trajectory-to-trajectory matching, which estimates an affine transformation matrix that maximizes the overlap of thermal and visible foreground pixels. Sensor fusion uses the aligned images to compute sum-rule silhouettes, and then constructs thermal–visible object models. Finally, multiple object tracking uses the blobs constructed in sensor fusion to output the trajectories. Experimental results demonstrate that the proposed framework achieves better image registration and tracking than performing registration and tracking separately.
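As a rough illustration of the registration step described above, the sketch below fits an affine matrix to matched thermal and visible trajectory points with RANSAC and scores it by foreground overlap. It uses OpenCV's `estimateAffine2D` and `warpAffine`; the function names, data layout, and threshold values are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np
import cv2


def estimate_affine_from_trajectories(thermal_pts, visible_pts):
    """Fit a 2x3 affine matrix mapping thermal trajectory points to the
    visible frame with RANSAC; returns the matrix and the inlier mask.
    (Illustrative sketch, not the paper's implementation.)"""
    thermal_pts = np.asarray(thermal_pts, dtype=np.float32)
    visible_pts = np.asarray(visible_pts, dtype=np.float32)
    matrix, inliers = cv2.estimateAffine2D(
        thermal_pts, visible_pts,
        method=cv2.RANSAC, ransacReprojThreshold=3.0)  # threshold is assumed
    return matrix, inliers


def foreground_overlap(matrix, thermal_fg, visible_fg):
    """Overlap ratio of thermal and visible foreground masks after warping
    the thermal mask into the visible frame; candidate matrices can be
    ranked by this value."""
    h, w = visible_fg.shape
    warped = cv2.warpAffine(thermal_fg, matrix, (w, h))
    both = np.logical_and(warped > 0, visible_fg > 0).sum()
    either = np.logical_or(warped > 0, visible_fg > 0).sum()
    return both / either if either else 0.0
```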
► Integrated framework for multimodal registration, sensor fusion, and tracking.
► Application to far-range scene monitoring using uncalibrated cameras.
► Comprehensive survey of multimodal video surveillance state-of-the-art methods.
► Iterative feedback scheme for improved trajectory-based registration and tracking.
► New method of transformation matrix selection based on fusion score (a sketch follows this list).
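The sum-rule silhouette fusion and fusion-score selection named in the highlights could look roughly like the following; the per-pixel probability maps, weights, and scoring rule are assumptions for illustration rather than the paper's exact formulation.

```python
import numpy as np


def sum_rule_silhouette(thermal_prob, visible_prob,
                        w_thermal=0.5, w_visible=0.5, threshold=0.5):
    """Combine per-pixel foreground probabilities from the aligned thermal
    and visible images with a weighted sum, then threshold to obtain a fused
    silhouette. Weights and threshold are assumed values."""
    fused = w_thermal * thermal_prob + w_visible * visible_prob
    return fused >= threshold


def select_by_fusion_score(candidate_matrices, warp_fn,
                           thermal_prob, visible_prob):
    """Keep the candidate affine matrix whose warped thermal probabilities
    agree best with the visible ones, measured inside the fused silhouette.
    warp_fn is a hypothetical helper that warps a map with a 2x3 matrix."""
    best, best_score = None, -1.0
    for m in candidate_matrices:
        warped = warp_fn(thermal_prob, m)
        silhouette = sum_rule_silhouette(warped, visible_prob)
        if not silhouette.any():
            continue
        # Agreement score: mean per-pixel minimum of the two probability maps
        # over the fused silhouette (an assumed proxy for the fusion score).
        score = np.minimum(warped, visible_prob)[silhouette].mean()
        if score > best_score:
            best, best_score = m, score
    return best, best_score
```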
Journal: Computer Vision and Image Understanding - Volume 116, Issue 2, February 2012, Pages 210–221