Article ID: 536195
Journal: Pattern Recognition Letters
Published Year: 2006
Pages: 9
File Type: PDF
Abstract

The well-known mixture of experts (ME) model has been used in many different areas, such as time series prediction, to account for nonlinearities and other complexities in the data. The ME model is usually trained by the expectation-maximization (EM) algorithm for maximum likelihood learning; however, the number of experts must be fixed beforehand, and it is rarely known in advance. Derived from regularization theory, a regularized minimum cross-entropy (RMCE) algorithm is proposed to train the ME model, performing model selection automatically during parameter learning. When time series are modeled by ME, climate prediction experiments demonstrate that the RMCE algorithm outperforms the EM algorithm. We also compare the RMCE algorithm with other regression methods, such as back-propagation (BP) and normalized radial basis function (NRBF) networks, and find that it yields promising results. Moreover, we investigate the curve detection problem with the ME model trained by the RMCE algorithm, which can detect curves (straight lines or circles) in a binary image. Simulations and image experiments show that the RMCE algorithm can automatically determine the number of straight lines or circles during parameter learning in the presence of noise, and in this respect it outperforms the Hough transform (HT).
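
For context, the EM-trained ME model that the abstract uses as its baseline can be sketched as follows. This is a minimal illustration, assuming linear experts with softmax gating and Gaussian noise; the function name fit_me_em and all hyperparameters are hypothetical, and the paper's RMCE training (which adds a regularization term so the number of experts is selected automatically) is not reproduced here.

import numpy as np

def gaussian_pdf(y, mu, var):
    # Univariate Gaussian density, vectorized over experts.
    return np.exp(-0.5 * (y - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def fit_me_em(X, y, K=3, n_iter=100, lr=0.1, seed=0):
    # Illustrative sketch of a mixture of linear experts trained by
    # (generalized) EM; NOT the paper's RMCE algorithm.
    rng = np.random.default_rng(seed)
    N, D = X.shape
    W = rng.normal(scale=0.1, size=(K, D))   # expert regression weights
    V = rng.normal(scale=0.1, size=(K, D))   # softmax gating weights
    var = np.full(K, np.var(y) + 1e-6)       # per-expert noise variances
    for _ in range(n_iter):
        # E-step: posterior responsibility of each expert for each point.
        logits = X @ V.T                                  # (N, K)
        gate = np.exp(logits - logits.max(axis=1, keepdims=True))
        gate /= gate.sum(axis=1, keepdims=True)
        lik = gaussian_pdf(y[:, None], X @ W.T, var[None, :])
        r = gate * lik
        r /= r.sum(axis=1, keepdims=True) + 1e-12
        # M-step: responsibility-weighted least squares per expert.
        for k in range(K):
            Rk = r[:, k]
            A = X.T @ (Rk[:, None] * X) + 1e-6 * np.eye(D)
            W[k] = np.linalg.solve(A, X.T @ (Rk * y))
            resid = y - X @ W[k]
            var[k] = max((Rk @ resid ** 2) / (Rk.sum() + 1e-12), 1e-6)
        # Gating: one gradient ascent step on the expected complete-data
        # log-likelihood (a generalized-EM simplification of IRLS).
        V += lr * (r - gate).T @ X / N
    return W, V, var

With this baseline, the key difference introduced by RMCE is the regularized cross-entropy objective: during training, the mixing weight of superfluous experts is driven toward zero, so the number of experts (or, in the curve detection setting, the number of lines or circles) need not be fixed in advance as K is here.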

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Vision and Pattern Recognition
Authors