| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 8878730 | Engineering in Agriculture, Environment and Food | 2017 | 17 Pages | |
Abstract
In this study, a vision-based furrow line detection method was developed for navigating an autonomous robot vehicle in an agricultural field. The furrow line detection method integrates a crop/non-crop field identification method, two types of box filters (a color-based furrow detection filter and a grayscale separability-based furrow detection filter), and a robust furrow line parameter estimator. In experiments, the performance of the developed method was tested on more than 8000 images of 17 types of test fields: nine types of crop fields (sweet pea, green pea, snow pea, lettuce, Chinese cabbage, cabbage, green pepper, tomato, and tea) and eight types of tilled soil fields. With a wide camera angle at a low depression angle, the detection rate of the furrow line was 98.0%, the root mean square error (RMSE) of the furrow line position was 12.1 pixels, and the RMSE of the furrow line angle was 3.8°. With an oblique camera angle, the detection rate was 93.4%, the RMSE of position was 23.3 pixels, and the RMSE of angle was 6.1°. These results show that the method could detect the approximate furrow line in the test fields under both the wide and oblique camera angles. The average processing speed was approximately 2.5 Hz for the crop fields and 4.0 Hz for the tilled soil fields. The method demonstrated high potential to robustly and precisely detect a single targeted furrow line in the 17 types of test fields.
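The grayscale separability criterion mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`separability`, `furrow_column`), the window geometry, and the column-sweep search are assumptions; the sketch only shows the general idea of scoring how well a candidate furrow strip separates from its flanking regions using a Fisher-style between-class/total variance ratio.

```python
import numpy as np

def separability(region_a, region_b):
    """Between-class separability of two grayscale pixel groups:
    sigma_B^2 / sigma_T^2, in [0, 1] (1 = perfectly separable)."""
    a = np.asarray(region_a, dtype=float).ravel()
    b = np.asarray(region_b, dtype=float).ravel()
    all_px = np.concatenate([a, b])
    total_var = all_px.var()
    if total_var == 0.0:
        return 0.0
    mean_all = all_px.mean()
    between = (a.size * (a.mean() - mean_all) ** 2
               + b.size * (b.mean() - mean_all) ** 2) / all_px.size
    return between / total_var

def furrow_column(gray, half_width=2):
    """Slide a box filter across image columns; the column whose
    centre strip is most separable from its flanks is returned as
    the furrow candidate (illustrative search, not the paper's)."""
    h, w = gray.shape
    best_col, best_score = -1, -1.0
    for c in range(half_width, w - half_width):
        centre = gray[:, c - half_width:c + half_width + 1]
        flanks = np.concatenate(
            [gray[:, max(0, c - 3 * half_width):c - half_width],
             gray[:, c + half_width + 1:c + 3 * half_width + 1]],
            axis=1)
        score = separability(centre, flanks)
        if score > best_score:
            best_col, best_score = c, score
    return best_col, best_score
```

On a synthetic image with a dark vertical strip, the sweep locks onto the strip because the centre/flank intensity split maximizes the variance ratio; a full pipeline would run such filters row-wise and fit the furrow line parameters robustly, as the abstract describes.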
Related Topics
Life Sciences
Agricultural and Biological Sciences
Agronomy and Crop Science
Authors
Yoshinari Morio, Kouki Teramoto, Katsusuke Murakami