Article ID | Journal ID | Year | English Article | Full Text |
---|---|---|---|---|
8146940 | 1524115 | 2015 | 29-page PDF | Free download |
English Title of the ISI Article
A sparse representation-based method for infrared dim target detection under sea-sky background
Persian Translation of the Title
A sparse representation-based method for infrared dim target detection against a sea-sky background
Keywords
Infrared image, target detection, sparse representation, dictionary learning
Related Subjects
Engineering and Basic Sciences
Physics and Astronomy
Atomic and Molecular Physics, and Optics
English Abstract
Automatic detection for infrared (IR) dim targets under complex sea-sky background is a challenging task. To explore an effective solution to the problem, this paper develops a sparse representation-based method by learning a sea-sky background dictionary. This framework is mainly composed of three modules: background dictionary learning, preliminary target localization, and accurate target identification. In the first module, a sea-sky background dictionary is learned from a large number of training samples, which has a good ability to model the cluttered sea-sky background. In the second module, given a test image, it is first divided into a set of patches; then, for each image patch, its sparse representation coefficients are computed over the learned dictionary. By analyzing the sparse reconstruction errors for the image patches, the target candidate areas can be predicted. In the third module, an infrared dim target recognition scheme is applied to those areas to recognize the true dim IR targets. Based on a set of comprehensive experiments, our algorithm has demonstrated better performance than several other infrared dim target detection methods.
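The detection principle in the abstract — encode each image patch sparsely over a background dictionary and flag patches with a large reconstruction error as target candidates — can be sketched in a few lines. The code below is not the authors' implementation: it substitutes a random toy dictionary for the learned sea-sky dictionary, uses a plain Orthogonal Matching Pursuit for the sparse coding step, and all sizes and variable names are illustrative assumptions.

```python
import numpy as np

def omp(D, y, k):
    """Greedy Orthogonal Matching Pursuit: approximate y with at most k atoms of D."""
    residual = y.copy()
    idx = []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # pick the most correlated atom
        if j not in idx:
            idx.append(j)
        sol, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ sol              # re-fit over the selected atoms
    coef = np.zeros(D.shape[1])
    coef[idx] = sol
    return coef

def recon_error(patch, D, k=5):
    """Sparse reconstruction error of a patch over dictionary D."""
    return np.linalg.norm(patch - D @ omp(D, patch, k))

rng = np.random.default_rng(0)
n_dim, n_atoms = 64, 128                  # e.g. 8x8 patches, 128 atoms (toy sizes)
D = rng.normal(size=(n_dim, n_atoms))     # stand-in for a learned background dictionary
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms

# A "background" patch lies in the span of a few dictionary atoms ...
support = rng.choice(n_atoms, size=3, replace=False)
bg = D[:, support] @ rng.normal(size=3)
# ... while a "target" patch adds a bright localized spot the dictionary cannot model.
tgt = bg.copy()
tgt[27] += 5.0

e_bg, e_tgt = recon_error(bg, D), recon_error(tgt, D)
print(e_bg, e_tgt)  # the target patch reconstructs much worse than the background patch
```

Thresholding such per-patch errors yields the "target candidate areas" of the second module; the paper's third module then applies a separate recognition scheme to reject false alarms.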
Publisher
Database: Elsevier - ScienceDirect
Journal: Infrared Physics & Technology - Volume 71, July 2015, Pages 347-355
Authors
Xin Wang, Siqiu Shen, Chen Ning, Mengxi Xu, Xijun Yan