Article code | Journal code | Publication year | English article | Full-text version
---|---|---|---|---
534589 | 870269 | 2013 | 10-page PDF | Free download

The design of binary W-operators, morphological operators that are translation-invariant and locally defined by a finite neighborhood window, corresponds to the problem of designing Boolean functions, or equivalently their characteristic functions. One of the main issues in the automatic, sample-based design of W-operators is generalization. Viewing the design of W-operators as a particular case of designing a pattern recognition system, in this paper we propose a new approach for the automatic design of binary W-operators. The approach consists of a functional representation of the class-membership conditional probability over the whole set of patterns viewed through a given window, instead of generalizing the class labels (or the characteristic-function values). The parameters of the functional representation are estimated by nonlinear regression performed with a feed-forward artificial neural network. The network is trained with a weighted mean-square-error cost function, which allows us to weight each pattern by its marginal probability under the given window. Experimental results, covering noise filtering in images of retinal angiographies, edge detection in noisy images, texture identification, and character recognition, show that the proposed approach outperforms not only pyramidal multiresolution, the best existing method for generalizing characteristic functions of W-operators, but also classical classifiers based on support vector machines, k-nearest neighbors, and convolutional neural networks.
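As a concrete illustration of the first idea in the abstract, the sketch below applies a binary W-operator: each output pixel is the value of a characteristic function evaluated on the binary pattern seen through a finite window centered at that pixel. This is a minimal, hypothetical example (the function names and the majority-vote characteristic function are illustrative choices, not the authors' implementation); the paper's contribution is to learn such a function, or rather the conditional probability it thresholds, from samples.

```python
import numpy as np

def apply_w_operator(image, psi, window=(3, 3)):
    """Apply a binary W-operator: slide a finite window over the image
    and set each output pixel to psi(pattern), where pattern is the
    binary configuration seen through the window (zero-padded borders)."""
    h, w = window
    pad_y, pad_x = h // 2, w // 2
    padded = np.pad(image, ((pad_y, pad_y), (pad_x, pad_x)), mode="constant")
    out = np.zeros_like(image)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            pattern = tuple(padded[i:i + h, j:j + w].ravel())
            out[i, j] = psi(pattern)
    return out

# Illustrative characteristic function: a majority vote over the window,
# which acts as a simple salt-and-pepper noise filter.
def majority(pattern):
    return 1 if 2 * sum(pattern) > len(pattern) else 0

noisy = np.array([[0, 1, 0, 0],
                  [1, 1, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0]], dtype=np.uint8)
filtered = apply_w_operator(noisy, majority)
```

Because the operator is defined only through the window pattern, it is translation-invariant and locally defined by construction; designing the operator reduces to choosing (or learning) `psi` over the finite set of window patterns.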
► A new generalization to automatically design window operators (W-operators) is proposed.
► The generalization is achieved by using artificial feed-forward neural networks.
► Our approach simplifies the computational representation of W-operators.
► Experimental results show that our approach performs better than pyramidal multiresolution.
► Our approach also outperforms SVM, kNN, and convolutional neural network classifiers.
Journal: Pattern Recognition Letters - Volume 34, Issue 9, 1 July 2013, Pages 970–979