Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
530308 | 869756 | 2012 | 11-page PDF | Free download |

Traditional parametric and nonparametric classifiers used for statistical pattern recognition have their own strengths and limitations. Parametric methods assume specific parametric models for the density functions or posterior probabilities of the competing classes, whereas nonparametric methods are free from such assumptions. When these model assumptions are correct, parametric methods outperform nonparametric classifiers, especially when the training sample is small; when the assumptions are violated, however, parametric classifiers often perform poorly, and nonparametric methods work well. In this article, we attempt to overcome these limitations and combine the strengths of the two approaches. The resulting classifiers, called hybrid classifiers, perform like parametric classifiers when the model assumptions are valid but, unlike parametric classifiers, also provide safeguards against possible deviations from those assumptions. We propose several multiscale methods for hybrid classification and evaluate their performance on a number of simulated and benchmark data sets.
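To make the hybrid idea concrete, here is a minimal illustrative sketch, not the authors' exact method: a parametric piece (per-class univariate Gaussians) and a nonparametric piece (k-NN vote fractions) are combined as a convex mixture of their posterior estimates. The function names, the fixed mixing weight `alpha`, and the one-dimensional setting are all assumptions made for brevity.

```python
import math
from collections import Counter

def fit_gaussian(X, y):
    """Parametric piece: per-class univariate Gaussian (mean, variance, prior)."""
    model = {}
    for c in set(y):
        pts = [x for x, lab in zip(X, y) if lab == c]
        mu = sum(pts) / len(pts)
        var = sum((p - mu) ** 2 for p in pts) / len(pts) + 1e-9  # avoid zero variance
        model[c] = (mu, var, len(pts) / len(X))
    return model

def gaussian_posteriors(model, x):
    """Normalized class posteriors under the fitted Gaussian model."""
    scores = {c: prior * math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
              for c, (mu, var, prior) in model.items()}
    z = sum(scores.values()) or 1.0
    return {c: s / z for c, s in scores.items()}

def knn_posteriors(X, y, x, k=3):
    """Nonparametric piece: class vote fractions among the k nearest neighbours."""
    nearest = sorted(zip(X, y), key=lambda p: abs(p[0] - x))[:k]
    votes = Counter(lab for _, lab in nearest)
    return {c: votes.get(c, 0) / k for c in set(y)}

def hybrid_predict(X, y, x, alpha=0.5, k=3):
    """Convex combination of the two posterior estimates. The paper chooses the
    weight data-adaptively; here alpha is a fixed tuning knob for illustration."""
    p_par = gaussian_posteriors(fit_gaussian(X, y), x)
    p_np = knn_posteriors(X, y, x, k)
    mixed = {c: alpha * p_par[c] + (1 - alpha) * p_np.get(c, 0.0) for c in p_par}
    return max(mixed, key=mixed.get)
```

With `alpha = 1` the hybrid reduces to the purely parametric rule, and with `alpha = 0` to pure k-NN; intermediate values hedge between the two, which is the safeguard behaviour the abstract describes.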
► Proposed hybrid classifiers aggregate parametric and nonparametric classifiers.
► Perform like parametric classifiers when model assumptions are correct.
► Provide safeguards against model mis-specifications.
► Multiscale versions are computationally efficient and perform better than stacking.
► Overall performance is better than that of parametric and nonparametric classifiers.
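The multiscale aspect mentioned above can be illustrated with a small self-contained sketch, again under stated assumptions rather than as the paper's actual algorithm: posterior estimates from a nonparametric classifier are averaged over several neighbourhood sizes (scales) instead of committing to a single one. The scale set `ks` and the uniform averaging are illustrative choices.

```python
from collections import Counter

def knn_posteriors(X, y, x, k):
    """Class vote fractions among the k nearest neighbours of x (1-D data)."""
    nearest = sorted(zip(X, y), key=lambda p: abs(p[0] - x))[:k]
    votes = Counter(lab for _, lab in nearest)
    return {c: votes.get(c, 0) / k for c in set(y)}

def multiscale_predict(X, y, x, ks=(1, 3, 5)):
    """Average the posterior estimates across several scales k, then take the
    argmax class. Uniform weights over ks are an assumption for simplicity."""
    classes = set(y)
    avg = {c: 0.0 for c in classes}
    for k in ks:
        post = knn_posteriors(X, y, x, k)
        for c in classes:
            avg[c] += post[c] / len(ks)
    return max(avg, key=avg.get)
```

Averaging over scales removes the need to pick one "best" k in advance, which is one plausible reason such multiscale aggregation can be cheaper and more stable than stacking-style combination.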
Journal: Pattern Recognition - Volume 45, Issue 6, June 2012, Pages 2288–2298