| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 1153051 | Statistics & Probability Letters | 2010 | 8 | |
Abstract
Jiang and Tanner (2008) consider a method of classification using the Gibbs posterior, which is constructed directly from the empirical classification errors. They propose an algorithm to sample from the Gibbs posterior that uses a smoothed approximation of the empirical classification error, via a Gibbs sampler with augmented latent variables. In this paper, we note some drawbacks of this algorithm and propose an alternative method for sampling from the Gibbs posterior, based on the Metropolis algorithm. The numerical performance of the two algorithms is examined and compared on simulated data. We find that the Metropolis algorithm produces good classification results with improved computational speed.
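To illustrate the approach the abstract describes, the sketch below draws from a Gibbs posterior of the generic form exp(-lambda * empirical error) times a Gaussian prior, using a random-walk Metropolis sampler. This is a minimal illustration under assumed choices (toy linear classifier, N(0, I) prior, a particular temperature lambda and step size), not the authors' exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed for illustration): labels from a linear rule sign(X @ beta)
n = 200
X = rng.normal(size=(n, 2))
beta_true = np.array([1.0, -1.0])
y = np.sign(X @ beta_true + 0.3 * rng.normal(size=n))

def empirical_error(beta):
    # Fraction of misclassified points for the linear classifier sign(X @ beta)
    return np.mean(np.sign(X @ beta) != y)

def log_gibbs_posterior(beta, lam=5.0 * np.sqrt(n)):
    # Gibbs posterior: exp(-lambda * empirical error) with a N(0, I) prior;
    # lambda (the "temperature") is an assumed tuning choice here
    return -lam * empirical_error(beta) - 0.5 * beta @ beta

def metropolis(n_iter=2000, step=0.2):
    # Random-walk Metropolis: propose beta + step * N(0, I), accept with
    # probability min(1, posterior ratio); no latent-variable augmentation needed
    beta = np.zeros(2)
    logp = log_gibbs_posterior(beta)
    samples = []
    for _ in range(n_iter):
        prop = beta + step * rng.normal(size=2)
        logp_prop = log_gibbs_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            beta, logp = prop, logp_prop
        samples.append(beta.copy())
    return np.array(samples)

samples = metropolis()
beta_hat = samples[1000:].mean(axis=0)  # posterior-mean classifier after burn-in
```

Note that the empirical classification error is a step function of beta, so its gradient is useless; the Metropolis sampler sidesteps this because it only needs posterior evaluations, which is one reason it is a natural fit for the Gibbs posterior.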
Related Topics
Physical Sciences and Engineering
Mathematics
Statistics and Probability
Authors
Kun Chen, Wenxin Jiang, Martin A. Tanner