Article ID: 6864308
Journal: Neurocomputing
Published Year: 2018
Pages: 50
File Type: PDF
Abstract
Multi-label learning is the problem in which each instance is associated with multiple labels simultaneously. Binary Relevance (BR), which derives from the one-vs-all idea in multi-class classification, is a representative algorithm for multi-label learning. It ignores label correlations and may suffer from the class-imbalance problem. Moreover, BR is usually implemented by decomposing the task into many independent binary classifiers, each learned separately, which makes it difficult to extend. Furthermore, when BR is extended by learning the classifiers jointly, the commonly used least-squares loss function is better suited to a regression task than to a classification task. In this paper, we propose a unified framework implementing linear BR for multi-label learning, which is easy to extend and can also be applied in the one-vs-all setting for multi-class classification. We focus mainly on five popular convex loss functions. Experimental results show that the unified framework achieves performance competitive with traditional implementations and with several other well-established algorithms for both multi-label learning and multi-class classification. Furthermore, the logistic, exponential, and squared hinge loss functions are more suitable for multi-label learning, while the logistic and squared hinge loss functions are more suitable for multi-class classification.
Related Topics
Physical Sciences and Engineering · Computer Science · Artificial Intelligence
Authors