Article ID: 525772
Journal: Computer Vision and Image Understanding
Published Year: 2012
Pages: 12
File Type: PDF
Abstract

In this work, we propose a new integrated framework that addresses the problems of thermal–visible video registration, sensor fusion, and people tracking for far-range videos. The video registration is based on RANSAC trajectory-to-trajectory matching, which estimates an affine transformation matrix that maximizes the overlap of thermal and visible foreground pixels. Sensor fusion uses the aligned images to compute sum-rule silhouettes and then constructs thermal–visible object models. Finally, multiple object tracking uses the blobs constructed during sensor fusion to output object trajectories. Results demonstrate that the proposed framework achieves better image registration and tracking than methods that perform registration and tracking separately.
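As a rough illustration of the registration and fusion steps described in the abstract, the Python sketch below estimates an affine transform from matched trajectory points with OpenCV's RANSAC estimator and then blends the aligned foreground silhouettes with a simple sum rule. The function names, weights, and thresholds, as well as the use of cv2.estimateAffine2D, are assumptions made for illustration and do not reproduce the authors' implementation.

```python
# Hypothetical sketch of trajectory-based registration and sum-rule fusion
# (assumes OpenCV and NumPy; not the authors' implementation).
import cv2
import numpy as np

def estimate_affine_from_trajectories(thermal_pts, visible_pts):
    """RANSAC affine estimate from matched trajectory points (N x 2 arrays)."""
    M, inliers = cv2.estimateAffine2D(
        thermal_pts.astype(np.float32),
        visible_pts.astype(np.float32),
        method=cv2.RANSAC,
        ransacReprojThreshold=3.0,  # assumed tolerance, in pixels
    )
    return M, inliers

def sum_rule_silhouette(thermal_fg, visible_fg, M, weight=0.5, thresh=0.5):
    """Warp the binary thermal foreground into the visible frame and fuse by a sum rule."""
    h, w = visible_fg.shape
    warped = cv2.warpAffine(thermal_fg.astype(np.float32), M, (w, h))
    fused = weight * warped + (1.0 - weight) * visible_fg.astype(np.float32)
    return (fused >= thresh).astype(np.uint8)  # fused silhouette mask
```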

► Integrated framework for multimodal registration, sensor fusion, and tracking.
► Application to far-range scene monitoring using uncalibrated cameras.
► Comprehensive survey of state-of-the-art multimodal video surveillance methods.
► Iterative feedback scheme for improved trajectory-based registration and tracking.
► New method of transformation matrix selection based on fusion score (illustrated in the sketch below).
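The last highlight, selecting a transformation matrix by a fusion score, could be realized as choosing the candidate affine matrix that maximizes the overlap of the aligned thermal and visible foreground pixels. The sketch below is a hedged illustration of such a selection step; the IoU-style overlap measure is an assumption, not necessarily the paper's exact criterion.

```python
# Hypothetical fusion-score-based selection of a transformation matrix
# (assumed overlap score; not the authors' exact criterion).
import cv2
import numpy as np

def fusion_score(thermal_fg, visible_fg, M):
    """Overlap (intersection over union) of warped thermal and visible foreground masks."""
    h, w = visible_fg.shape
    warped = cv2.warpAffine(thermal_fg.astype(np.uint8), M, (w, h)) > 0
    vis = visible_fg > 0
    inter = np.logical_and(warped, vis).sum()
    union = np.logical_or(warped, vis).sum()
    return inter / union if union else 0.0

def select_transformation(candidates, thermal_fg, visible_fg):
    """Pick the candidate 2x3 affine matrix with the highest fusion score."""
    return max(candidates, key=lambda M: fusion_score(thermal_fg, visible_fg, M))
```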

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition