Article ID: 4947865
Journal: Neurocomputing
Published Year: 2017
Pages: 13 Pages
File Type: PDF
Abstract

For most regression problems, the optimal regression model is obtained by minimizing a loss function, and the choice of loss function has a great effect on the performance of the resulting model. The squared loss is widely used in regression and is theoretically optimal for Gaussian noise. However, real data are usually polluted by complex and unknown noise; especially in the era of big data, the noise may not be fitted well by any single distribution. To address this problem, two novel nonlinear regression models, for single-task and multi-task problems, are developed in this work, in which the noise is fitted by a Mixture of Gaussians. It has been proved that any continuous distribution can be approximated by a Mixture of Gaussians. To obtain the optimal parameters of the proposed models, an iterative algorithm based on Expectation Maximization is designed. The proposed models thus become self-adaptive robust nonlinear regression models. Experimental results on synthetic and real-world benchmark datasets show that the proposed models perform well compared with current regression algorithms and provide superior robustness.
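The core idea of the abstract, fitting the noise (residual) distribution with a Mixture of Gaussians via Expectation Maximization, can be illustrated with a minimal sketch. This is not the authors' algorithm; it only shows the E-step/M-step updates for a K-component Gaussian mixture over 1-D residuals, with a simple random initialization (the function name and initialization scheme are assumptions for illustration):

```python
import numpy as np

def fit_mog_residuals(residuals, k=2, n_iter=50, seed=0):
    """Fit a K-component 1-D Gaussian mixture to regression residuals via EM.

    Minimal illustrative sketch; initialization and stopping rule are
    hypothetical simplifications, not the paper's method.
    """
    rng = np.random.default_rng(seed)
    r = np.asarray(residuals, dtype=float)
    n = r.size
    # Initialize mixing weights, means, and variances.
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(r, size=k, replace=False)
    var = np.full(k, r.var() + 1e-6)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each residual.
        dens = np.exp(-0.5 * (r[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * r[:, None]).sum(axis=0) / nk
        var = (resp * (r[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    return pi, mu, var
```

In a robust-regression loop of the kind the abstract describes, such an EM step would alternate with re-fitting the regression model under the mixture-weighted loss, letting the noise model adapt to heavy-tailed or multi-modal residuals.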

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence