Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
411752 | Neurocomputing | 2015 | 9 | |
Multi-label Dimensionality reduction via Dependence Maximization (MDDM) was recently proposed to cope with high-dimensional multi-label data. MDDM projects the original data onto a lower-dimensional feature space in which the dependence between the features and the associated class labels is maximized. However, MDDM requires the eigen-decomposition of a dense matrix, which is computationally expensive for high-dimensional data. In addition, MDDM is not guaranteed to capture the correlations among multiple labels, which are highly beneficial to multi-label learning. To solve MDDM efficiently, in this paper we propose a novel framework that requires no matrix eigen-decomposition; specifically, our algorithm has time complexity linear in the dimensionality of the data. Further, we show that MDDM can be reformulated as a least-squares problem, enabling us to integrate a shared subspace that effectively uncovers interactions among multiple labels. Extensive experiments conducted on benchmark data collections verify the effectiveness of our proposed model.
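For context, the trace optimization underlying MDDM with a linear feature kernel (following Zhang and Zhou's original formulation; the notation below is an assumption for illustration, not taken from this paper's text) can be written as:

```latex
% Data X \in \mathbb{R}^{d \times n}, label kernel L \in \mathbb{R}^{n \times n},
% centering matrix H = I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top}.
\max_{P \in \mathbb{R}^{d \times m},\; P^{\top}P = I}
  \operatorname{tr}\!\left( P^{\top} X H L H X^{\top} P \right)
% The original method takes P as the top-m eigenvectors of the dense
% d x d matrix X H L H X^{\top}, which costs O(d^3).
```

One eigen-decomposition-free route consistent with the abstract's linear-time claim: with a linear label kernel L = Y^T Y for a label matrix Y of size k x n, the matrix X H L H X^T factors as C C^T with C = (X H) Y^T, so its top eigenvectors are the top left singular vectors of the d x k matrix C, obtainable in time linear in d. The sketch below illustrates this idea only; it is a plausible reading of the abstract under the stated assumptions, not the authors' published algorithm, and all names are hypothetical.

```python
import numpy as np

def mddm_projection_sketch(X, Y, m):
    """Minimal sketch: dependence-maximizing projection without a dense
    d x d eigen-decomposition (hypothetical; assumes L = Y^T Y).

    X : (d, n) feature matrix, one column per instance
    Y : (k, n) binary label matrix
    m : target dimensionality (m <= k assumed)
    """
    # Center the features, i.e., right-multiply by H = I - (1/n) 1 1^T.
    Xc = X - X.mean(axis=1, keepdims=True)
    # Cross-covariance between centered features and labels: d x k.
    C = Xc @ Y.T
    # Thin SVD of C: its left singular vectors are the eigenvectors of
    # C C^T = X H L H X^T, so no d x d eigen-decomposition is needed.
    U, _, _ = np.linalg.svd(C, full_matrices=False)
    return U[:, :m]  # (d, m) projection matrix P; project as Z = P.T @ X
```

The least-squares view mentioned in the abstract would instead regress a label-derived target on the centered data, which is what makes regularizers and a shared subspace straightforward to incorporate; the exact target construction is the paper's contribution and is not reproduced here.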