Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
406474 | Neurocomputing | 2014 | 9 | |
•A BPNN is an effective non-linear forecasting model, but it is usually not robust when applied to small data sets.
•To address this problem, this study uses a virtual sample generation (VSG) method to stabilize BPNN models built from small data sets.
•This study develops a new VSG method that explores the integrated effects of attributes, whereas earlier methods usually treat attributes independently.
•The approach determines the acceptable range of each attribute using MTD functions, and then uses a genetic algorithm (GA) to find the most-feasible virtual samples.
While back-propagation neural networks (BPNN) are effective learning tools for building non-linear models, they are often unstable when trained on small data sets. To address this problem, we construct artificial samples, called virtual samples, to improve learning robustness. This research develops a novel virtual sample generation (VSG) method, named genetic algorithm-based virtual sample generation (GABVSG), which considers the integrated effects and constraints of data attributes. We first determine the acceptable range of each attribute using mega-trend diffusion (MTD) functions and construct a feasibility-based programming (FBP) model with the BPNN. A genetic algorithm (GA) is then applied to accelerate the generation of the most-feasible virtual samples. Finally, we use two real cases to verify the performance of the proposed method, comparing the results with two well-known forecasting models, BPNN and support vector machine for regression (SVR), as well as a newly published MTD-based approach [1]. The experimental results indicate that GABVSG outperforms training on the original data without artificial samples. Consequently, the proposed method can significantly improve learning performance when working with small samples.
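To make the two core steps of the abstract concrete, the minimal sketch below shows (a) per-attribute mega-trend diffusion bounds, following the commonly cited MTD formulation, and (b) a simple GA that searches inside those bounds for feasible virtual samples. The triangular-membership fitness, the helper names (`mtd_bounds`, `ga_virtual_samples`), and all parameter values are illustrative assumptions, not the paper's FBP model, which additionally couples feasibility to a trained BPNN.

```python
import numpy as np

def mtd_bounds(x):
    """Mega-trend diffusion bounds for one attribute (common MTD formulation)."""
    x = np.asarray(x, dtype=float)
    cl = (x.min() + x.max()) / 2.0           # center of the data range
    n_l = max(int((x < cl).sum()), 1)        # sample counts on each side
    n_u = max(int((x >= cl).sum()), 1)
    skew_l = n_l / (n_l + n_u)               # skewness weights
    skew_u = n_u / (n_l + n_u)
    var = x.var(ddof=1) if len(x) > 1 else 0.0
    spread = -2.0 * var * np.log(1e-20)      # diffusion term
    lower = cl - skew_l * np.sqrt(spread / n_l)
    upper = cl + skew_u * np.sqrt(spread / n_u)
    return lower, cl, upper

def membership(v, lower, cl, upper):
    """Triangular MTD membership: 1 at the center, 0 at the bounds."""
    if v <= lower or v >= upper:
        return 0.0
    return (v - lower) / (cl - lower) if v < cl else (upper - v) / (upper - cl)

def ga_virtual_samples(data, pop_size=50, generations=100, rng=None):
    """Evolve candidate virtual samples inside the MTD ranges (illustrative)."""
    rng = rng or np.random.default_rng(0)
    bounds = np.array([mtd_bounds(col) for col in data.T])   # shape (d, 3)
    lo, cl, hi = bounds[:, 0], bounds[:, 1], bounds[:, 2]
    pop = rng.uniform(lo, hi, size=(pop_size, data.shape[1]))

    def fitness(s):
        # stand-in feasibility score: product of per-attribute memberships
        return np.prod([membership(v, l, c, h)
                        for v, l, c, h in zip(s, lo, cl, hi)])

    for _ in range(generations):
        scores = np.array([fitness(s) for s in pop])
        # tournament selection of parents
        idx = np.array([max(rng.choice(pop_size, 2, replace=False),
                            key=lambda i: scores[i]) for _ in range(pop_size)])
        parents = pop[idx]
        # blend crossover between paired parents
        alpha = rng.uniform(size=(pop_size, 1))
        children = alpha * parents + (1 - alpha) * parents[::-1]
        # small mutation, clipped back into the MTD range
        children += rng.normal(0.0, 0.01, children.shape) * (hi - lo)
        pop = np.clip(children, lo, hi)

    scores = np.array([fitness(s) for s in pop])
    return pop[np.argsort(scores)[::-1]]     # most-feasible candidates first

if __name__ == "__main__":
    # tiny made-up data set, two attributes
    small_set = np.array([[2.1, 30.0], [2.4, 33.5], [1.9, 29.2], [2.6, 35.1]])
    virtual = ga_virtual_samples(small_set)[:10]
    print(virtual)   # candidate virtual samples to append to the training set
```

In the paper's full pipeline, the surviving candidates would be appended to the original training set before retraining the BPNN; the feasibility score above is only a stand-in for the FBP objective described in the abstract.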