| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 412313 | Robotics and Autonomous Systems | 2006 | 16 | |
Abstract
A new technique for vision processing is presented which lets a mobile robot equipped with an omnidirectional camera perform appearance-based global localization in real time. The technique is applied directly to the omnidirectional camera images, producing low-dimensional rotation-invariant feature vectors without any training or set-up phase. Using the feature vectors, particle filters can accurately estimate the location of a continuously moving real robot, processing 5000 simultaneous localization hypotheses on-line. Estimated body positions overlap the actual ones in over 95% of the time steps. The feature vectors degrade gracefully under increasing levels of simulated noise and occlusion.
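The abstract does not spell out how the rotation-invariant features are computed or how the particle filter consumes them. The sketch below is one plausible reading, not the authors' method: rotation invariance is approximated with FFT magnitudes of an angular intensity profile, and each particle is scored against an assumed appearance map (`ref_positions`, `ref_features`). The function names, parameters, and noise model are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's exact algorithm): rotation-invariant
# omnidirectional image features plus a particle-filter localization step.
import numpy as np

def rotation_invariant_feature(omni_image, n_bins=64, n_coeffs=8):
    """Reduce an omnidirectional image to a low-dimensional, rotation-invariant
    vector: mean intensity per angular bin, then FFT magnitudes. A rotation of
    the robot circularly shifts the angular profile, which changes only the FFT
    phases, so the magnitudes stay (approximately) unchanged."""
    h, w = omni_image.shape
    cy, cx = h / 2.0, w / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    angles = np.arctan2(ys - cy, xs - cx)                    # -pi..pi per pixel
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    sums = np.bincount(bins.ravel(), weights=omni_image.ravel(), minlength=n_bins)
    counts = np.bincount(bins.ravel(), minlength=n_bins)
    profile = sums / np.maximum(counts, 1)                   # angular intensity profile
    return np.abs(np.fft.rfft(profile))[:n_coeffs]           # rotation-invariant part

def particle_filter_step(particles, weights, odometry, feature,
                         ref_positions, ref_features,
                         motion_noise=0.05, sigma=1.0, rng=None):
    """One predict/update/resample cycle over len(particles) pose hypotheses.
    Each particle is weighted by how closely the observed feature matches the
    reference feature stored at the nearest mapped position (an assumed,
    simplified appearance map)."""
    rng = np.random.default_rng() if rng is None else rng
    # Predict: apply odometry plus Gaussian noise to every (x, y) hypothesis.
    particles = particles + odometry + rng.normal(0, motion_noise, particles.shape)
    # Update: likelihood from feature-space distance to the nearest reference.
    nearest = np.argmin(np.linalg.norm(
        particles[:, None, :] - ref_positions[None, :, :], axis=2), axis=1)
    d = np.linalg.norm(ref_features[nearest] - feature, axis=1)
    weights = weights * np.exp(-0.5 * (d / sigma) ** 2)
    weights /= weights.sum()
    # Resample: draw new particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

With 5000 particles, as in the abstract, the prediction and update steps above stay fully vectorized in NumPy, which is one way such an update can run on-line while the robot moves; the paper's actual implementation details may differ.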
Related Topics
- Physical Sciences and Engineering
- Computer Science
- Artificial Intelligence
Authors
F. Linåker, M. Ishikawa