Article code | Journal code | Year | English article | Full text |
---|---|---|---|---|
536627 | 870586 | 2009 | 9-page PDF | Free download |
In this paper, we describe a Bayesian classification method that informatively combines diverse sources of information and multiple feature spaces for multiclass problems. The method builds on recent advances in kernel approaches, where multiple object descriptors, or feature spaces, are integrated via kernel combination. Each kernel defines a similarity metric between objects in a particular feature space; once a common metric exists across modalities, an overall combination can be constructed. We follow a hierarchical Bayesian approach that places prior distributions over the random variables, and we construct a Gibbs-sampling Markov chain Monte Carlo (MCMC) solution derived naturally from the employed multinomial probit likelihood. The methodology serves as a basis for possible deterministic approximations, such as variational or maximum-a-posteriori estimators, and is compared against well-known classifier-combination methods on the classification of handwritten numerals. The proposed method shows a significant improvement over the best individual classifier and matches the performance of the best multiple-classifier combination, while reducing the computational cost of combining classifiers and offering additional information on the significance of the contributing sources.
Journal: Pattern Recognition Letters - Volume 30, Issue 1, 1 January 2009, Pages 46–54
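The central idea the abstract describes, a per-feature-space kernel acting as a similarity metric and a weighted combination of those kernels, can be sketched minimally. This is not the paper's Gibbs-sampling multinomial probit model; it is a toy illustration under assumed data, kernel widths, and combination weights, with a simple mean-similarity classifier standing in for the Bayesian machinery.

```python
import math

def rbf_kernel(x, y, gamma):
    """Gaussian (RBF) similarity between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def combined_kernel(views_x, views_y, gammas, betas):
    """Weighted sum of per-feature-space kernels: K = sum_s beta_s * K_s."""
    return sum(b * rbf_kernel(vx, vy, g)
               for vx, vy, g, b in zip(views_x, views_y, gammas, betas))

# Toy training set: each object is described in two feature spaces ("views").
# All values below are illustrative assumptions, not data from the paper.
train = [
    (([0.0, 0.1], [1.0]), 0),
    (([0.2, 0.0], [0.9]), 0),
    (([1.0, 1.1], [0.1]), 1),
    (([0.9, 1.0], [0.0]), 1),
]
gammas = [1.0, 1.0]   # kernel width per feature space (assumed)
betas = [0.7, 0.3]    # non-negative combination weights summing to 1

def predict(views_x):
    """Assign the class whose training points are most similar on average
    under the combined kernel."""
    scores = {}
    for views_t, label in train:
        k = combined_kernel(views_x, views_t, gammas, betas)
        scores[label] = scores.get(label, 0.0) + k
    return max(scores, key=scores.get)

print(predict(([0.1, 0.1], [1.0])))  # close to the class-0 examples
print(predict(([1.0, 1.0], [0.1])))  # close to the class-1 examples
```

In the paper, the combination weights are treated as random variables with priors and inferred by MCMC, which is what yields the reported information about the significance of each contributing source; here they are simply fixed.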