Article ID: 6867087
Journal: Robotics and Autonomous Systems
Published Year: 2018
Pages: 18 Pages
File Type: PDF
Abstract
Velocity estimation is essential for multicopters to guarantee flight stability and maneuverability. To this end, this paper proposes a new method for multicopter velocity estimation based on visual and inertial information in GPS-denied or confined environments. The method requires no map or artificial landmark of the environment; only the off-the-shelf onboard sensors of a multicopter, namely a low-cost Inertial Measurement Unit (IMU), a downward-looking monocular camera and a downward-facing ultrasonic range finder, are exploited to constitute the vision motion constraint. This constraint relates the metric velocity to point correspondences between successive images, for which an efficient approach based on the Mean Shift (MS) algorithm is developed to detect outliers and select optimal matching points. An observability analysis then theoretically verifies that the estimation system is observable. Furthermore, by combining the vision motion constraint with a multicopter dynamic model, the metric velocity is estimated using a standard Linear Kalman Filter (LKF). Finally, the proposed method is tested with synthetic data from simulation as well as flight experiments using real data from DJI Matrice 100 and Guidance. The simulation and experimental results indicate that the proposed method can accurately estimate the velocity of the multicopter in GPS-denied or confined environments.
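To give a rough sense of the fusion step described in the abstract, the sketch below shows a minimal linear Kalman filter that propagates a velocity state with a simplified drag-style multicopter model and corrects it with a metric velocity pseudo-measurement of the kind the vision motion constraint (optical flow scaled by ultrasonic height) would supply. The state layout, drag coefficient, sample rate and noise levels are all illustrative assumptions, not values taken from the paper.

```python
# Minimal linear Kalman filter sketch for multicopter velocity estimation.
# Hypothetical illustration only: state layout, drag model and noise values
# below are assumptions, not the paper's actual formulation.
import numpy as np

dt = 0.02      # filter period [s], assuming a 50 Hz IMU rate
drag = 0.3     # assumed linear drag coefficient of a simplified multicopter model

# State: body-frame horizontal velocity [vx, vy].
A = (1.0 - drag * dt) * np.eye(2)   # simplified dynamic (drag) model
B = dt * np.eye(2)                  # accelerometer input matrix
H = np.eye(2)                       # assume the vision constraint observes velocity directly
Q = 0.05 * np.eye(2)                # process noise covariance (assumed)
R = 0.10 * np.eye(2)                # measurement noise covariance (assumed)

x = np.zeros(2)                     # velocity estimate
P = np.eye(2)                       # estimate covariance

def predict(accel_xy):
    """Propagate the velocity with IMU acceleration and the drag model."""
    global x, P
    x = A @ x + B @ accel_xy
    P = A @ P @ A.T + Q

def update(vel_meas_xy):
    """Correct with a metric velocity pseudo-measurement derived from the
    vision motion constraint (image correspondences scaled by height)."""
    global x, P
    y = vel_meas_xy - H @ x                 # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Example step: one IMU prediction followed by one vision update.
predict(np.array([0.10, -0.05]))
update(np.array([0.12, -0.03]))
print("estimated velocity:", x)
```

Because both the assumed dynamic model and the measurement model are linear in the velocity state, a standard LKF suffices here; the paper's actual filter couples the vision motion constraint with the multicopter dynamic model in the same prediction/update pattern.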
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors