Article ID: 536168
Journal: Pattern Recognition Letters
Published Year: 2007
Pages: 12 Pages
File Type: PDF
Abstract

In this paper, we evaluate a new ensemble schema for regression in which the ensemble comprises a number of models, each built on feature-sampled data using a learning algorithm drawn from a set of simple and stable learning algorithms, with Stacking as the ensemble integration method. We compare this schema, referred to as non-strict heterogeneous Stacking, with a number of baseline methods and with strict heterogeneous Stacking, which uses exactly one model per base learning algorithm, each built on un-sampled data. We demonstrate that, for the set of base learning algorithms evaluated, non-strict Stacking strongly outperformed the baseline methods. In addition, the added flexibility of non-strict Stacking allowed it to outperform both strict Stacking and homogeneous Stacking for the same set of base learning algorithms. Finally, we discuss the general conditions under which non-strict heterogeneous Stacking is likely to be advantageous.
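The setup described in the abstract can be sketched in code. The following is a minimal, illustrative example only: it assumes a small pool of simple, stable learners, a random feature sample per base model, and a linear meta-learner, none of which are the paper's exact choices. It uses scikit-learn's `StackingRegressor` as a stand-in for the Stacking integration method.

```python
# Illustrative sketch of non-strict heterogeneous Stacking (not the paper's
# exact setup): more base models than base algorithms, each model trained on
# its own randomly sampled feature subset, combined by a Stacking meta-learner.
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=300, n_features=10, noise=5.0, random_state=0)

# Assumed pool of simple/stable base algorithms; the paper's pool may differ.
pool = [LinearRegression, Ridge, KNeighborsRegressor]


class FeatureSubset(BaseEstimator, TransformerMixin):
    """Restricts the input to a fixed subset of feature columns."""

    def __init__(self, idx=None):
        self.idx = idx

    def fit(self, X, y=None):
        return self

    def transform(self, X):
        return X[:, self.idx]


# Non-strict: six models drawn (with repetition) from a pool of three
# algorithms, each trained on its own 5-feature sample of the data.
estimators = []
for i in range(6):
    idx = rng.choice(X.shape[1], size=5, replace=False)
    algo = pool[i % len(pool)]()
    estimators.append((f"m{i}", make_pipeline(FeatureSubset(idx), algo)))

# Stacking integrates the base models via a meta-learner trained on their
# cross-validated predictions.
stack = StackingRegressor(estimators=estimators,
                          final_estimator=LinearRegression())
stack.fit(X, y)
print(round(stack.score(X, y), 3))
```

Strict heterogeneous Stacking would instead build exactly `len(pool)` models, one per algorithm, each on the full (un-sampled) feature set; homogeneous Stacking would draw every base model from a single algorithm.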

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors