Article ID: 530736
Journal: Pattern Recognition
Published Year: 2008
Pages: 9
File Type: PDF
Abstract

We present a learning algorithm for nominal vector data. It builds a complex classifier by iteratively adding a simple function that modifies the current classifier. To limit overtraining, we focus on a class of such functions for which optimal Bayesian learning is tractable. We investigate several classes of functions that yield models similar to Naïve Bayes and logistic classification. We report experimental results on a collection of standard data sets showing that our learning algorithm outperforms standard learning of these models.
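
The abstract describes a stagewise procedure: a simple per-feature scoring function is added to the current classifier at each round. The sketch below illustrates that general idea for nominal (categorical) feature vectors, using smoothed per-value log-odds tables as the simple functions; it is a hypothetical illustration of such an additive scheme, not the paper's exact algorithm, and all function names and the smoothing choice are assumptions.

```python
import numpy as np

def fit_stagewise_nominal(X, y, n_rounds=10):
    """Stagewise additive classifier for nominal feature vectors.

    Each round scans the features, builds a simple per-value score table
    (smoothed log-odds of the positive class, a crude Bayesian-style prior),
    and adds the table that most reduces training error to the current
    additive classifier.  Illustrative sketch only.
    """
    n, d = X.shape
    scores = np.zeros(n)      # current additive score for each example
    model = []                # list of (feature index, {value: score}) terms
    for _ in range(n_rounds):
        best = None
        for j in range(d):
            # simple function: one score per observed value of feature j
            table = {}
            for v in np.unique(X[:, j]):
                mask = X[:, j] == v
                pos = (y[mask] == 1).sum() + 1.0   # +1 smoothing
                neg = (y[mask] == 0).sum() + 1.0
                table[v] = np.log(pos / neg)
            delta = np.array([table[v] for v in X[:, j]])
            err = np.mean(((scores + delta) > 0).astype(int) != y)
            if best is None or err < best[0]:
                best = (err, j, table, delta)
        _, j, table, delta = best
        scores += delta
        model.append((j, table))
    return model

def predict(model, X):
    """Sum the learned per-feature tables and threshold at zero."""
    scores = np.zeros(len(X))
    for j, table in model:
        scores += np.array([table.get(v, 0.0) for v in X[:, j]])
    return (scores > 0).astype(int)
```

With a single round this reduces to a one-feature Naïve-Bayes-like rule; adding further rounds accumulates per-feature log-odds terms, which is why the resulting models resemble Naïve Bayes and logistic classification.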

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition
Authors