Article ID: 6864504
Journal: Neurocomputing
Published Year: 2018
Pages: 10
File Type: PDF
Abstract
Matrix factorization (MF) has been one of the most powerful machine learning techniques for collaborative filtering, and it has been widely extended to improve performance on various tasks. For recommendation tasks, it is worth noting that a single user or item is in fact only sparsely correlated with the latent factors extracted by MF, a property that has not been exploited in existing works. We therefore focus on leveraging sparse representation, a successful feature learning scheme for high-dimensional data, within the latent factor model. We propose a Sparse LAtent Model (SLAM) based on the ideas of sparse representation and matrix factorization. In SLAM, the item and user representation vectors in the latent space are expected to be sparse, which is induced by ℓ1-regularization on those vectors. In addition, we introduce a dual graph Laplacian regularization term to simultaneously integrate knowledge from both the user network and the item network. An iterative optimization method is presented to solve the new learning problem. Experiments on real datasets show that SLAM predicts user-item ratings better than state-of-the-art matrix factorization based methods.
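Since only the abstract is available here, the exact formulation of SLAM is not given; as a hedged sketch, a model combining these ingredients would typically minimize an objective of the following form, where R is the rating matrix, W masks observed entries, U and V are the user and item latent factor matrices, L_u and L_v are the graph Laplacians of the user and item networks, and λ, α are assumed trade-off weights (all notation here is illustrative, not the authors'):

\min_{U,V}\ \big\| W \odot (R - U V^{\top}) \big\|_F^2 \;+\; \lambda \big( \|U\|_1 + \|V\|_1 \big) \;+\; \alpha \big( \operatorname{tr}(U^{\top} L_u U) + \operatorname{tr}(V^{\top} L_v V) \big)

In such a sketch, the ℓ1 terms induce sparse per-user and per-item latent representations, while the trace terms act as the dual graph Laplacian regularizer, pulling connected users or items toward similar factors; objectives of this kind are commonly solved by alternating iterative (e.g., proximal or coordinate) updates of U and V, consistent with the iterative optimization the abstract describes.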
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence