Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6865807 | Neurocomputing | 2015 | 6 | |
Abstract
In this paper, least trimmed squares (LTS) estimators, frequently used in robust (or resistant) linear parametric regression problems, are generalized to nonparametric LTS neural networks for nonlinear regression problems. Particular emphasis is placed on robustness against outliers. This provides alternative learning machines for general nonlinear learning problems. Simple weight-updating rules based on gradient descent and iteratively reweighted least squares (IRLS) algorithms are provided. The key parameter, the trimming percentage for the data at hand, can be determined by cross-validation. Simulated and real-world data are used to illustrate the use of LTS neural networks. We compare the robustness against outliers of usual neural networks trained with the least squares criterion and of the proposed LTS neural networks. Simulation results show that the LTS neural networks proposed in this paper have good robustness against outliers.
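To make the LTS criterion concrete, the sketch below trains a one-hidden-layer network by gradient descent on the sum of the h smallest squared residuals, i.e., the squared residuals are sorted at each step and the largest ones are trimmed away. This is only a minimal illustration under assumptions of our own (hidden size, learning rate, trimming fraction, synthetic data); it is not the authors' implementation, and it uses plain gradient descent rather than the IRLS variant mentioned in the abstract.

```python
import numpy as np

# Minimal sketch of LTS-style training for a one-hidden-layer network.
# Hyperparameters and data below are illustrative assumptions, not values
# taken from the paper.

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with a few gross outliers.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
y[:10] += 5.0  # inject outliers

n_hidden = 10
W1 = rng.standard_normal((1, n_hidden)) * 0.5
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal(n_hidden) * 0.5
b2 = 0.0

lr = 0.05
trim = 0.8               # trimming percentage: keep the 80% smallest squared residuals
h = int(trim * len(X))   # number of retained samples

for epoch in range(2000):
    # Forward pass.
    hidden = np.tanh(X @ W1 + b1)   # (n, n_hidden)
    pred = hidden @ W2 + b2         # (n,)
    resid = pred - y

    # LTS step: order squared residuals and keep only the h smallest.
    keep = np.argsort(resid ** 2)[:h]

    # Gradient of (1/(2h)) * sum of trimmed squared residuals.
    r = resid[keep]
    hk = hidden[keep]
    grad_pred = r / h
    gW2 = hk.T @ grad_pred
    gb2 = grad_pred.sum()
    grad_hidden = np.outer(grad_pred, W2) * (1 - hk ** 2)  # tanh derivative
    gW1 = X[keep].T @ grad_hidden
    gb1 = grad_hidden.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("trimmed MSE:", np.mean(np.sort(resid ** 2)[:h]))
```

Because the outlying samples produce the largest residuals, they are excluded from the trimmed sum and therefore do not pull the fitted curve toward them, which is the robustness property the abstract describes.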
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors
Yih-Lon Lin, Jer-Guang Hsieh, Jyh-Horng Jeng, Wen-Chin Cheng