Article ID: 4969496
Journal: Pattern Recognition
Published Year: 2018
Pages: 32
File Type: PDF
Abstract
Multi-task learning (MTL) has been shown to improve the performance of individual tasks by learning multiple related tasks together. Recently, nonparametric Bayesian Gaussian Process (GP) models have also been adapted to MTL; they offer considerable flexibility thanks to their nonparametric nature and are therefore free of assumptions about the probability distributions of the variables. To date, two approaches have been proposed to implement GP-based MTL: cross-covariance-based methods and joint feature learning methods. Although successfully applied in scenarios such as face verification and collaborative filtering, both have drawbacks: the cross-covariance-based method scales poorly because of the large covariance matrix involved, while the joint feature learning method incorporates relations between tasks only implicitly and thus fails to explicitly exploit prior knowledge, such as inter-task correlation, that is crucial for further improving MTL. To address both issues, this paper establishes a two-layer unified framework, the Hierarchical Gaussian Process Multi-task Learning (HGPMT) method, which jointly learns the latent features shared among tasks and a multi-task model. Furthermore, since HGPMT does not involve the cross-covariance, its computational complexity is much lower. Finally, experimental results on both a toy multi-task regression dataset and real datasets demonstrate that it outperforms recently proposed approaches in multi-task learning performance.
Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition