Article code: 526413
Journal code: 869109
Publication year: 2008
English article: 17-page PDF
Full-text version: free download
English title of the ISI article
Image segmentation algorithm development using ground truth image data sets
Related subjects
Engineering and Basic Sciences › Computer Engineering › Computer Vision and Pattern Recognition
English abstract

A methodology is presented for making use of ground truth, human-segmented image data sets to compare, develop and optimize image segmentation algorithms. Central to this task is the problem of quantifying the accuracy of the match between machine and reference segmentations. In this regard, the paper introduces a natural extension to the concept of precision-recall curves, which are a standard evaluation technique in pattern recognition. Computationally efficient match measures, defined so as to benefit from the availability of multiple alternative human segmentations, are also proposed. The Berkeley image segmentation data set is used to select among the proposed measures, which results in a validation of the local best fit heuristic as a way to best exploit reference segmentations. I then show how the resulting match criterion can be used to improve the recent SRM segmentation algorithm through gradual modifications and additions. In particular, I demonstrate and quantify performance increases resulting from changing color coordinates, optimizing the segment merging rule, introducing texture, and forcing segments to stop at edges. As modifications to the algorithm require the optimization of parameters, a mixed deterministic and Monte-Carlo method well adapted to the problem is introduced. A demonstration of how the method can be used to compare the performance of two algorithms is given, and its broad applicability to other segmentation methods is discussed.
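To make the idea of precision-recall matching against multiple human references concrete, the sketch below is a minimal, illustrative take on a "local best fit" region-overlap measure: each machine segment is scored against whichever segment, in whichever human reference, fits it best. The function name, the product criterion used to pick the best match, and the area-weighted averaging are assumptions made for this example; the paper's actual match measures and heuristic differ in their details.

```python
import numpy as np

def segment_precision_recall(machine, humans):
    """Illustrative 'local best fit' region-overlap score.

    machine : 2-D integer label map produced by a segmentation algorithm.
    humans  : list of 2-D integer label maps from human annotators.
    Returns area-weighted (precision, recall) over machine segments,
    each matched to its best-fitting human segment across all references.
    """
    precisions, recalls, weights = [], [], []
    for m_label in np.unique(machine):
        m_mask = machine == m_label
        m_area = m_mask.sum()
        best_p = best_r = 0.0
        for human in humans:                        # try every reference segmentation
            for h_label in np.unique(human[m_mask]):  # only overlapping human segments
                h_mask = human == h_label
                inter = np.logical_and(m_mask, h_mask).sum()
                p = inter / m_area                  # fraction of machine segment covered
                r = inter / h_mask.sum()            # fraction of human segment recovered
                if p * r > best_p * best_r:         # keep the best-fitting match
                    best_p, best_r = p, r
        precisions.append(best_p)
        recalls.append(best_r)
        weights.append(m_area)
    w = np.asarray(weights, dtype=float)
    return np.average(precisions, weights=w), np.average(recalls, weights=w)

# Toy usage: two machine segments, two (hypothetical) human references.
machine = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1]])
human_a = np.array([[0, 0, 0, 1],
                    [0, 0, 0, 1]])
human_b = np.array([[2, 2, 3, 3],
                    [2, 2, 3, 3]])
print(segment_precision_recall(machine, [human_a, human_b]))
```

Letting each machine segment pick its own best reference, rather than forcing all segments to match a single human segmentation, is one simple way to benefit from having several alternative ground truths; the paper evaluates such choices on the Berkeley data set.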

Publisher
Database: Elsevier - ScienceDirect
Journal: Computer Vision and Image Understanding - Volume 112, Issue 2, November 2008, Pages 143–159
Authors