Article ID: 4969246
Journal: Journal of Visual Communication and Image Representation
Published Year: 2017
Pages: 10
File Type: PDF
Abstract
Kernel dependence maximization for multi-label dimensionality reduction (kMDDM) has recently been proposed to cope with high-dimensional multi-label data. To produce discriminant projection vectors, kMDDM utilizes the Hilbert-Schmidt independence criterion to capture the dependence between the feature description and the associated labels. However, the computation of kMDDM involves the eigen-decomposition of dense matrices, which is known to be computationally expensive for large-scale problems. In this paper, we reformulate the original kMDDM as a least-squares problem, significantly lessening the computational burden by employing conjugate gradient algorithms. Furthermore, appealing regularization techniques can be incorporated into the least-squares model to boost generalization performance. Extensive experiments conducted on benchmark data collections verify the effectiveness of our proposed model.
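The computational point the abstract makes can be illustrated with a minimal sketch: a least-squares formulation lets one solve the (regularized) normal equations with the conjugate gradient method using only matrix-vector products, never forming or decomposing a dense matrix. This is not the paper's exact kMDDM formulation; the data, the ridge target, and the function names below are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A_mv, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite A,
    accessed only through a matrix-vector product A_mv(v)."""
    x = np.zeros_like(b)
    r = b - A_mv(x)          # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A_mv(p)
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Illustrative regularized least-squares problem (not the paper's model):
#   min_w ||X w - t||^2 + lam ||w||^2
# whose normal equations are (X^T X + lam I) w = X^T t.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))   # n samples, d features
t = rng.standard_normal(200)         # a single target vector
lam = 1e-2

# Matrix-vector product avoids ever materializing X^T X.
A_mv = lambda v: X.T @ (X @ v) + lam * v
w = conjugate_gradient(A_mv, X.T @ t)

# Sanity check against a direct dense solve.
w_direct = np.linalg.solve(X.T @ X + lam * np.eye(50), X.T @ t)
print(np.allclose(w, w_direct, atol=1e-5))
```

Because each CG iteration costs only two matrix-vector products with `X`, the overall cost scales with the number of nonzeros in the data rather than with a cubic dense factorization, which is the efficiency argument the abstract appeals to.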