Article ID: 9653562
Journal: Neurocomputing
Published Year: 2005
Pages: 13
File Type: PDF
Abstract
We apply support vector learning to attributed graphs, where the kernel matrices are based on approximations of the Schur-Hadamard inner product. Evaluating the Schur-Hadamard inner product for a pair of graphs requires determining an optimal match between their nodes and edges; we therefore approximate it efficiently by means of recurrent neural networks. The optimal mapping involved allows a direct interpretation of the similarity or dissimilarity of the two graphs. We present and discuss experimental results for different classifiers constructed by an SVM operating on positive semi-definite (psd) and non-psd kernel matrices.
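As a rough illustration of the classification setup described in the abstract, the sketch below trains an SVM on a precomputed graph-kernel (Gram) matrix. The function schur_hadamard_kernel and the graph objects are hypothetical placeholders, not the authors' recurrent-network approximation; only the precomputed-kernel SVM usage follows standard scikit-learn API.

```python
# Minimal sketch (assumptions marked): an SVM classifier over a precomputed
# graph-kernel matrix. `schur_hadamard_kernel` is a placeholder for the
# (approximate) Schur-Hadamard inner product described in the paper, which the
# authors approximate with a recurrent neural network that searches for an
# optimal node/edge match between two attributed graphs.
import numpy as np
from sklearn.svm import SVC

def schur_hadamard_kernel(g1, g2):
    """Hypothetical placeholder: return an approximate Schur-Hadamard inner
    product between two attributed graphs."""
    raise NotImplementedError

def gram_matrix(graphs_a, graphs_b):
    """Pairwise kernel evaluations. Because the inner product is only
    approximated, the resulting matrix may be non-psd."""
    return np.array([[schur_hadamard_kernel(a, b) for b in graphs_b]
                     for a in graphs_a])

# Usage sketch (train_graphs, test_graphs, y_train are assumed to exist):
# K_train = gram_matrix(train_graphs, train_graphs)   # shape (n_train, n_train)
# clf = SVC(kernel="precomputed").fit(K_train, y_train)
# K_test = gram_matrix(test_graphs, train_graphs)     # shape (n_test, n_train)
# y_pred = clf.predict(K_test)
```

Note that libsvm-based solvers accept a non-psd precomputed matrix but give no convergence guarantees in that case, which is why the paper compares classifiers built on psd and non-psd kernel matrices.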
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors