| Article ID | Journal | Published Year | Pages | File Type |
| --- | --- | --- | --- | --- |
| 406254 | Neurocomputing | 2015 | 10 | |
• We show that the formulation of multi-label linear discriminant analysis (MLDA) can be equivalently cast as a least-squares problem.
• Iterative conjugate gradient algorithms for the least-squares problem can be employed to substantially reduce the computational cost of the original MLDA.
• Appealing regularization techniques can be readily incorporated into the least-squares model to boost generalization performance.
Classical linear discriminant analysis has recently been extended to multi-label dimensionality reduction. However, Multi-label Linear Discriminant Analysis (MLDA) involves the eigen-decomposition of dense matrices, which is known to be computationally expensive for large-scale problems. In this paper, we show that the formulation of MLDA can be equivalently recast within a new least-squares framework, which significantly mitigates the computational overhead and scales to data collections of higher dimensionality. Furthermore, appealing regularization techniques can be readily incorporated into the least-squares model to boost generalization accuracy. Experimental results on several popular multi-label benchmarks not only verify the established equivalence relationship but also corroborate the effectiveness and efficiency of the proposed algorithms.
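To illustrate the least-squares route, the following is a minimal Python/SciPy sketch: a ridge-regularized multi-label least-squares problem solved column by column with conjugate gradient, avoiding any dense eigen-decomposition. The data, the indicator-style target matrix `T`, and the regularization strength `lam` are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Hypothetical data: n samples, d features, k labels (not from the paper).
rng = np.random.default_rng(0)
n, d, k = 500, 2000, 6
X = rng.standard_normal((n, d))                # feature matrix
Y = (rng.random((n, k)) < 0.2).astype(float)   # binary label matrix

# A simple normalized class-indicator target; the paper's target
# construction for the MLDA-equivalent least-squares problem may differ.
T = Y / np.maximum(Y.sum(axis=0), 1.0)

lam = 1e-2  # ridge regularization strength (assumed)

# Solve (X^T X + lam*I) W = X^T T with conjugate gradient, using only
# matrix-vector products so the d x d matrix is never formed explicitly.
def matvec(w):
    return X.T @ (X @ w) + lam * w

A = LinearOperator((d, d), matvec=matvec)
B = X.T @ T
W = np.column_stack([cg(A, B[:, j])[0] for j in range(k)])

Z = X @ W  # reduced-dimensional representation (at most k components)
```

Each CG iteration costs two matrix-vector products with `X`, so the solve stays cheap even when `d` is large, which is the regime where eigen-decomposing dense scatter matrices becomes prohibitive.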