Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
720138 | IFAC Proceedings Volumes | 2007 | 8 |
A novel method for estimating vehicle roll, pitch and yaw using machine vision and inertial sensors is presented, based on matching images captured by an on-vehicle camera to a rendered representation of the surrounding terrain obtained from an on-board map database. United States Geological Survey Digital Elevation Models (DEMs) were used to create a 3D topographic map of the terrain surrounding the vehicle, and it is assumed in this work that large segments of the surrounding terrain are visible, particularly the horizon lines. The horizon lines seen in the video captured from the vehicle are compared to the horizon lines obtained from the rendered terrain, allowing absolute comparison of the rendered and actual scenes in roll, pitch and yaw. A kinematic Kalman filter modeling an inertial navigation system then uses the scene-matching results to generate filtered estimates of orientation. Experiments using an instrumented vehicle operating at the test track of the Pennsylvania Transportation Institute were performed to validate the method, and the results show very close agreement between the vision-based orientation estimates and those from a high-quality GPS/INS system.
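To illustrate the fusion step the abstract describes, below is a minimal sketch of a kinematic Kalman filter in which integrated gyro rates propagate a [roll, pitch, yaw] state and the horizon-matching result supplies an absolute orientation measurement. The class name, small-angle kinematic model, and noise values are illustrative assumptions for this sketch, not details taken from the paper.

```python
import numpy as np

class OrientationKF:
    """Hypothetical kinematic filter: gyro prediction + vision update."""

    def __init__(self, q_gyro=1e-4, r_vision=1e-2):
        self.x = np.zeros(3)          # state: [roll, pitch, yaw] in radians
        self.P = np.eye(3) * 1e-2     # state covariance
        self.Q = np.eye(3) * q_gyro   # process noise from gyro integration
        self.R = np.eye(3) * r_vision # measurement noise of horizon matching

    def predict(self, gyro_rates, dt):
        # Propagate orientation by integrating body rates over dt
        # (a small-angle approximation; the paper's model may differ).
        self.x = self.x + gyro_rates * dt
        self.P = self.P + self.Q * dt

    def update(self, vision_rpy):
        # Fuse an absolute [roll, pitch, yaw] fix from horizon matching.
        H = np.eye(3)                 # vision observes the state directly
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (vision_rpy - self.x)
        self.P = (np.eye(3) - K @ H) @ self.P

# Example cycle: gyro prediction at 100 Hz, vision fix at a slower rate.
kf = OrientationKF()
kf.predict(gyro_rates=np.array([0.01, 0.0, 0.02]), dt=0.01)
kf.update(vision_rpy=np.array([0.001, 0.0, 0.002]))
print(kf.x)
```

The key property this structure captures is that gyro integration alone drifts without bound, while the periodic absolute fixes from horizon matching bound the orientation error, which is why the paper compares the fused estimates against a GPS/INS reference.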