Article ID | Journal ID | Year | English article | Full-text version |
---|---|---|---|---
391830 | 662014 | 2016 | 20-page PDF | Free download |
• We prove that boosted regression trees are more expressive than linear and quadratic functions for representing conditional preferences.
• We propose using boosted regression trees to represent conditional preferences in recommender systems.
• The proposed model is learned by combining gradient boosting and coordinate descent.
Conditional preferences are common in practice but are seldom taken into account by recommender systems, which can lead to unsatisfactory recommendations. To address this issue, we propose using boosted regression trees to represent conditional preferences in recommender systems; they are more expressive than linear and quadratic functions for this purpose. Compared with existing conditional preference models, boosted regression trees can handle the large amounts of data found in recommender systems, owing to their reasonable storage requirements and low learning complexity. We integrate boosted regression trees into the matrix factorization framework and propose an algorithm that combines gradient boosting and coordinate descent to learn the model. The proposed method is evaluated on four real-world datasets and compared with state-of-the-art matrix factorization based methods. The experimental results show that it outperforms most of the comparison methods.
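The abstract describes alternating between coordinate descent on the latent factors and gradient boosting on the residual. The paper's exact formulation is not given here, so the following is only a minimal illustrative sketch of that alternation on synthetic data: matrix factorization fits the linear part of the ratings, and boosted regression trees (here scikit-learn's `GradientBoostingRegressor`, a stand-in for the paper's model) fit the remaining conditional, nonlinear component from side features. All variable names and hyperparameters are hypothetical.

```python
# Hypothetical sketch: matrix factorization augmented by boosted regression
# trees over per-rating context features, learned by alternating updates.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n_users, n_items, k, lam = 30, 40, 4, 0.1

# Synthetic ratings = low-rank part + a nonlinear "conditional" part.
U_true = rng.normal(size=(n_users, k))
V_true = rng.normal(size=(n_items, k))
context = rng.normal(size=(n_users, n_items, 3))      # side features
R = U_true @ V_true.T + np.sin(2.0 * context[..., 0])

obs = rng.random((n_users, n_items)) < 0.5            # observed mask
rows, cols = np.nonzero(obs)
X = context[rows, cols]                               # features per rating
y = R[rows, cols]

U = rng.normal(scale=0.1, size=(n_users, k))
V = rng.normal(scale=0.1, size=(n_items, k))
tree_pred = np.zeros(len(rows))

for outer in range(5):
    # Coordinate descent (ridge-regularized ALS-style updates) on the
    # latent factors, targeting what the trees have not yet explained.
    target = y - tree_pred
    for _ in range(3):
        for u in range(n_users):
            idx = rows == u
            if idx.any():
                Vi = V[cols[idx]]
                U[u] = np.linalg.solve(Vi.T @ Vi + lam * np.eye(k),
                                       Vi.T @ target[idx])
        for i in range(n_items):
            idx = cols == i
            if idx.any():
                Uu = U[rows[idx]]
                V[i] = np.linalg.solve(Uu.T @ Uu + lam * np.eye(k),
                                       Uu.T @ target[idx])
    mf_pred = np.einsum('ij,ij->i', U[rows], V[cols])
    # Gradient boosting fits the residual left over by the factors.
    gbr = GradientBoostingRegressor(n_estimators=50, max_depth=2)
    gbr.fit(X, y - mf_pred)
    tree_pred = gbr.predict(X)

rmse = float(np.sqrt(np.mean((mf_pred + tree_pred - y) ** 2)))
```

The alternation matters: each component is refit on the other's residual, so the trees capture only the conditional structure the bilinear term cannot express.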
Journal: Information Sciences - Volume 327, 10 January 2016, Pages 1–20