Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
10327020 | Robotics and Autonomous Systems | 2014 | 12 | 
Abstract
In this work we present an in-situ method to calibrate two sensors, a LIDAR (Light Detection and Ranging) and a spherical camera, both used in urban environment reconstruction tasks. In this scenario, the speed at which the sensors acquire and merge information is very important; however, reconstruction accuracy, which depends on sensor calibration, is equally relevant. We propose a new calibration pattern that is visible to both sensors. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be recovered. Experimental results for the calibration and an uncertainty analysis are presented for data collected by a platform integrating a LIDAR and a spherical camera.
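The core of the LIDAR-camera correspondence the abstract describes is projecting each 3D laser point into the spherical image to sample its color. As a minimal sketch only, assuming an equirectangular image and identity extrinsics between the sensors (the paper's actual method estimates the rigid transform from the proposed calibration pattern, which is not reproduced here), the projection could look like:

```python
import math

def lidar_point_to_equirect(x, y, z, width, height):
    """Map a 3D point in the (assumed shared) sensor frame to pixel
    coordinates in an equirectangular spherical image.

    Hypothetical helper, not the authors' implementation: azimuth and
    elevation of the point are converted linearly to (u, v) pixels.
    """
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.atan2(y, x)              # azimuth in [-pi, pi]
    phi = math.asin(z / r)                # elevation in [-pi/2, pi/2]
    u = (theta + math.pi) / (2 * math.pi) * width
    v = (math.pi / 2 - phi) / math.pi * height
    return u, v

# A point straight ahead on the +x axis lands at the image center.
u, v = lidar_point_to_equirect(1.0, 0.0, 0.0, 2048, 1024)
```

With the calibrated rigid transform applied to each LIDAR point first, the sampled pixel gives the point's texture and color.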
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Angel-Iván García-Moreno, Denis-Eduardo Hernandez-García, José-Joel Gonzalez-Barbosa, Alfonso Ramírez-Pedraza, Juan B. Hurtado-Ramos, Francisco-Javier Ornelas-Rodriguez