Article ID: 495912
Journal: Applied Soft Computing
Published Year: 2012
Pages: 17
File Type: PDF
Abstract

Kernel principal component analysis (KPCA) and kernel linear discriminant analysis (KLDA) are two commonly used and effective methods for dimensionality reduction and feature extraction. In this paper, we propose a KLDA method based on maximal class separability for extracting the optimal features of analog fault data sets, and compare it with principal component analysis (PCA), linear discriminant analysis (LDA) and KPCA. In addition, a novel particle swarm optimization (PSO) based algorithm is developed to tune the parameters and structure of a neural network jointly. Our study shows that KLDA is superior overall to PCA, LDA and KPCA in feature extraction performance, and that the proposed PSO-based algorithm is easier to implement and achieves better training performance than the back-propagation algorithm. The simulation results demonstrate the effectiveness of these methods.
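As a rough, self-contained sketch of the kernel-discriminant idea behind KLDA (not the paper's exact maximal-class-separability formulation), the Python snippet below builds the between-class and within-class scatter matrices in the dual (kernel) space and projects samples onto the leading generalized eigenvectors. The RBF kernel, the gamma and reg values, and the names kda_fit/kda_transform are illustrative assumptions.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Squared Euclidean distances between all pairs of rows in X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kda_fit(X, y, gamma=1.0, reg=1e-3):
    """Kernel discriminant analysis: maximise between-class over
    within-class scatter in the (implicit) RBF feature space."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)                    # n x n Gram matrix
    classes = np.unique(y)
    m_star = K.mean(axis=1, keepdims=True)         # overall mean of kernel columns
    M = np.zeros((n, n))                           # between-class scatter (dual form)
    N = np.zeros((n, n))                           # within-class scatter (dual form)
    for c in classes:
        idx = np.where(y == c)[0]
        n_c = len(idx)
        K_c = K[:, idx]                            # n x n_c kernel block for class c
        m_c = K_c.mean(axis=1, keepdims=True)
        M += n_c * (m_c - m_star) @ (m_c - m_star).T
        N += K_c @ (np.eye(n_c) - np.ones((n_c, n_c)) / n_c) @ K_c.T
    N += reg * np.eye(n)                           # regularise for numerical stability
    # Leading generalized eigenvectors give the dual projection coefficients
    evals, evecs = np.linalg.eig(np.linalg.solve(N, M))
    order = np.argsort(-evals.real)[: len(classes) - 1]
    return evecs[:, order].real, X, gamma

def kda_transform(model, X_new):
    alpha, X_train, gamma = model
    return rbf_kernel(X_new, X_train, gamma) @ alpha   # extracted discriminant features

In a fault-diagnosis pipeline of this kind, the kda_transform output would feed the neural-network classifier, with the kernel width and regularization chosen by cross-validation.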

► We develop a novel fault diagnosis approach for analog circuits.
► We propose a KLDA method based on maximal class separability.
► A novel PSO-based tuning algorithm is proposed for selecting the neural network structure (see the sketch after the highlights).
► The improved KLDA is applied to extract effective features for analog fault diagnosis.
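The paper's PSO-based algorithm tunes the network structure and parameters jointly; the minimal sketch below covers only the simpler weight-training part, applying a basic global-best PSO to a fixed one-hidden-layer MLP in place of back-propagation. All hyperparameter values (n_hidden, inertia weight, c1, c2) and function names are assumptions for illustration, not the authors' settings.

import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(w, X, n_in, n_hidden, n_out):
    """Unpack a flat weight vector into a one-hidden-layer MLP and run it."""
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = w[i:i + n_out]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def pso_train(X, Y, n_hidden=8, n_particles=30, iters=200,
              w_inertia=0.7, c1=1.5, c2=1.5):
    """Train the MLP weights with a global-best PSO instead of back-propagation."""
    n_in, n_out = X.shape[1], Y.shape[1]
    dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
    pos = rng.normal(0, 0.5, (n_particles, dim))   # each particle is a full weight vector
    vel = np.zeros((n_particles, dim))
    def loss(w):
        return np.mean((mlp_forward(w, X, n_in, n_hidden, n_out) - Y) ** 2)
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if vals.min() < gbest_val:
            gbest, gbest_val = pos[np.argmin(vals)].copy(), vals.min()
    return gbest, gbest_val

The joint structure-and-parameter tuning described in the abstract would additionally encode discrete choices such as the hidden-layer size in each particle; that part is not reproduced here.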

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Science Applications