Article code | Journal code | Year | English article | Full-text version |
---|---|---|---|---|
11012396 | 1800229 | 2018 | 13-page PDF | Free download |
English title of the ISI article
One-two-one networks for compression artifacts reduction in remote sensing
Persian translation of the title
One-two-one networks for compression artifacts reduction in remote sensing
Keywords
00-01, 99-00, Compression artifacts reduction, Remote sensing, Deep learning, One-two-one network
Related subjects
Engineering and Basic Sciences
Computer Engineering
Information Systems
English abstract
Compression artifacts reduction (CAR) is a challenging problem in the field of remote sensing. Most recent deep learning based methods have demonstrated superior performance over the previous hand-crafted methods. In this paper, we propose an end-to-end one-two-one (OTO) network, to combine different deep models, i.e., summation and difference models, to solve the CAR problem. Particularly, the difference model motivated by the Laplacian pyramid is designed to obtain the high frequency information, while the summation model aggregates the low frequency information. We provide an in-depth investigation into our OTO architecture based on the Taylor expansion, which shows that these two kinds of information can be fused in a nonlinear scheme to gain more capacity of handling complicated image compression artifacts, especially the blocking effect in compression. Extensive experiments are conducted to demonstrate the superior performance of the OTO networks, as compared to the state-of-the-arts on remote sensing datasets and other benchmark datasets. The source code will be available here: https://github.com/bczhangbczhang/.
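The summation/difference decomposition described in the abstract resembles a one-level Laplacian-pyramid split: a low-pass path aggregates low-frequency content while the residual carries high-frequency detail such as blocking artifacts. The sketch below is a minimal NumPy illustration of that split, not the authors' OTO implementation; the box-blur kernel and function names are assumptions for illustration only.

```python
import numpy as np

def box_blur(img, k=3):
    """Low-pass filter via a k x k box kernel (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def one_level_split(img):
    """Laplacian-pyramid-style split into a low-frequency component
    (the "summation" path) and a high-frequency residual (the
    "difference" path)."""
    low = box_blur(img)     # aggregates smooth, low-frequency content
    high = img - low        # residual holds edges and blocking artifacts
    return low, high

img = np.arange(64, dtype=float).reshape(8, 8)
low, high = one_level_split(img)
# The decomposition is exact: summing the two paths recovers the input.
assert np.allclose(low + high, img)
```

In the paper's architecture the two components are processed by separate deep models and then fused nonlinearly, rather than simply re-added as in this toy reconstruction.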
Publisher
Database: Elsevier - ScienceDirect
Journal: ISPRS Journal of Photogrammetry and Remote Sensing - Volume 145, Part A, November 2018, Pages 184-196
Authors
Baochang Zhang, Jiaxin Gu, Chen Chen, Jungong Han, Xiangbo Su, Xianbin Cao, Jianzhuang Liu