Article ID: 11007994 · Journal: Neurocomputing · Published Year: 2018 · Pages: 32 · File Type: PDF
Abstract
The goal of representation learning for knowledge graphs is to encode both entities and relations into a low-dimensional embedding space. Most current works have demonstrated the benefits of knowledge graph embedding for single-knowledge-graph completion tasks such as relation extraction. The most significant distinction between embedding multiple knowledge graphs and embedding a single knowledge graph is that the former must consider the alignments between the multiple knowledge graphs, which are very helpful for applications built on multiple KGs, such as KB-QA and KG integration. In this paper, we propose a new automatic representation learning model over Multiple Knowledge Graphs (MGTransE) that adopts a bootstrapping method. More specifically, MGTransE consists of three core components: a Structure Model, a Semantically Smooth Embedding Model, and an Iterative Smoothness Model. Experimental results on two real-world datasets show that our method achieves better performance on two new multiple-KG tasks than state-of-the-art KG embedding models, while preserving the key properties of knowledge graph embedding on traditional single-KG tasks compared with methods learned from a single KG.
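The abstract does not give MGTransE's scoring function, but the name suggests it builds on the translation-based (TransE-style) family, where a triple (head, relation, tail) is plausible when the tail embedding lies near head + relation. The sketch below illustrates that idea only; the dimensions, noise scale, and function names are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # assumed embedding dimensionality, for illustration only

# Toy embeddings for one triple (head, relation, tail).
h = rng.normal(size=dim)
r = rng.normal(size=dim)
t = h + r + rng.normal(scale=0.01, size=dim)  # tail near h + r => plausible triple

def transe_score(h, r, t, norm=1):
    """Translation-based score ||h + r - t||; lower means more plausible."""
    return np.linalg.norm(h + r - t, ord=norm)

plausible = transe_score(h, r, t)
corrupted = transe_score(h, r, rng.normal(size=dim))  # random (corrupted) tail
print(plausible < corrupted)  # prints True: the true tail scores lower
```

Training such a model typically minimizes a margin-based ranking loss that pushes corrupted triples' scores above true triples' scores; a multi-KG variant like MGTransE would additionally constrain aligned entities across graphs to have nearby embeddings.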
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence