Article ID: 494861
Journal: Applied Soft Computing
Published Year: 2016
Pages: 13
File Type: PDF
Abstract

•Unlike traditional multi-label feature selection, the proposed algorithm is derived from different cognitive viewpoints.
•A simple and intuitive metric for evaluating candidate features is proposed.
•The proposed algorithm is applicable to both categorical and numerical features.
•The proposed method outperforms other state-of-the-art multi-label feature selection methods in our experiments.

Multi-label learning deals with data associated with a set of labels simultaneously. As in traditional single-label learning, the high dimensionality of data is a stumbling block for multi-label learning. In this paper, we first introduce the margin of an instance to granulate all instances under different labels, and we define three different concepts of neighborhood based on different cognitive viewpoints. Building on these, we generalize neighborhood information entropy to multi-label learning and propose three new measures of neighborhood mutual information. It is shown that these new measures are a natural extension from single-label learning to multi-label learning. We then present an optimization objective function to evaluate the quality of candidate features, which can be solved by approximating the multi-label neighborhood mutual information. Finally, extensive experiments on publicly available data sets verify the effectiveness of the proposed algorithm in comparison with state-of-the-art methods.
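The abstract does not reproduce the multi-label measures themselves. As a reference point only, the single-label neighborhood entropy and neighborhood mutual information that the paper generalizes are commonly defined as follows in the neighborhood rough set literature, where \(\delta_B(x_i)\) denotes the \(\delta\)-neighborhood of instance \(x_i\) with respect to feature subset \(B\) and \(n\) is the number of instances; this is a sketch of the standard single-label case, not the paper's multi-label formulation:

\[ NH_\delta(B) = -\frac{1}{n}\sum_{i=1}^{n} \log \frac{|\delta_B(x_i)|}{n} \]

\[ NH_\delta(B, C) = -\frac{1}{n}\sum_{i=1}^{n} \log \frac{|\delta_B(x_i) \cap \delta_C(x_i)|}{n} \]

\[ NMI_\delta(B; C) = NH_\delta(B) + NH_\delta(C) - NH_\delta(B, C) \]

A feature selection criterion built on these quantities typically scores a candidate feature by the neighborhood mutual information between the feature (together with those already selected) and the labels; the paper extends this idea to label sets through the three neighborhood definitions mentioned above.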


Related Topics
Physical Sciences and Engineering > Computer Science > Computer Science Applications