Article ID: 408965
Journal: Neurocomputing
Published Year: 2016
Pages: 15
File Type: PDF
Abstract

Owing to its theoretical advances and empirical successes, Multi-task Learning (MTL) has become a popular design paradigm for training a set of tasks jointly. By exploring the hidden relationships among multiple tasks, many MTL algorithms have been developed to enhance learning performance. In general, these complicated hidden relationships can be viewed as a combination of two key structural elements: task grouping and task outliers. Based on this view of the task relationship, we propose a generic MTL framework with flexible structure regularization, which relaxes the need for any specific structural assumption. In particular, we directly impose a joint ℓ1,1/ℓ2,1-norm as the regularization term to reveal the underlying task relationship in a flexible way. This flexible structure regularization term accommodates any convex combination of grouping and outlier structural characteristics among the multiple tasks. To derive efficient solutions for the generic MTL framework, we develop two algorithms, the Iteratively Reweighted Least Squares (IRLS) method and the Accelerated Proximal Gradient (APG) method, each with different emphases and strengths. In addition, we analyze the theoretical convergence and performance guarantees of both algorithms. Finally, extensive experiments on both synthetic and real data, together with comparisons against several state-of-the-art algorithms, demonstrate the superior performance of the proposed generic MTL method.
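To make the regularizer concrete, the following is a minimal sketch, not the paper's implementation, of an APG solver for multi-task least squares under a convex combination penalty lam * (alpha * ||W||_{1,1} + (1 - alpha) * ||W||_{2,1}). The grouping convention (rows of the d x T weight matrix W as the ℓ2,1 groups), the names apg_mtl and prox_flexible, and all parameter defaults are illustrative assumptions; for this sparse-group-style penalty, the proximal step factors into entrywise soft-thresholding followed by row-wise group shrinkage.

import numpy as np

def prox_flexible(W, t, alpha):
    # Proximal operator of t * (alpha * ||W||_{1,1} + (1 - alpha) * ||W||_{2,1}),
    # assuming rows of W form the l2,1 groups (an illustrative convention, not
    # necessarily the paper's). The prox of this sparse-group penalty is
    # entrywise soft-thresholding followed by row-wise group shrinkage.
    V = np.sign(W) * np.maximum(np.abs(W) - t * alpha, 0.0)
    row_norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t * (1.0 - alpha) / np.maximum(row_norms, 1e-12), 0.0)
    return scale * V

def apg_mtl(X, Y, lam=0.1, alpha=0.5, iters=200):
    # FISTA-style accelerated proximal gradient for
    #   min_W 0.5 * ||X @ W - Y||_F^2
    #         + lam * (alpha * ||W||_{1,1} + (1 - alpha) * ||W||_{2,1})
    # X: (n, d) shared design matrix, Y: (n, T) per-task targets, W: (d, T).
    d, T = X.shape[1], Y.shape[1]
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth gradient
    W = np.zeros((d, T))
    Z, tk = W.copy(), 1.0
    for _ in range(iters):
        grad = X.T @ (X @ Z - Y)           # gradient of the least-squares term at Z
        W_next = prox_flexible(Z - grad / L, lam / L, alpha)
        tk_next = (1.0 + np.sqrt(1.0 + 4.0 * tk ** 2)) / 2.0
        Z = W_next + ((tk - 1.0) / tk_next) * (W_next - W)  # momentum step
        W, tk = W_next, tk_next
    return W

Calling apg_mtl(X, Y, lam=0.1, alpha=1.0) recovers a pure entrywise-sparse penalty, while alpha=0.0 gives pure row-group sparsity; intermediate values trade the two off, which is the flexibility the abstract refers to. An IRLS variant would instead replace the proximal step with a sequence of reweighted quadratic subproblems.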

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors