Article code: 10127103
Journal code: 1645032
Publication year: 2018
English article: 17-page PDF
Full-text version: free download
English title of the ISI article
The Vapnik-Chervonenkis dimension of graph and recursive neural networks
Related topics
Engineering and Basic Sciences › Computer Engineering › Artificial Intelligence
English abstract
The Vapnik-Chervonenkis dimension (VC-dim) characterizes the sample complexity of a classification model and is often used as an indicator of the generalization capability of a learning method. The VC-dim has been studied for common feedforward neural networks, but it has not previously been studied for Graph Neural Networks (GNNs) and Recursive Neural Networks (RecNNs). This paper provides upper bounds on the order of growth of the VC-dim of GNNs and RecNNs. GNNs and RecNNs belong to a class of neural network models capable of processing inputs given as graphs. A graph is a data structure that generalizes the representational power of vectors and sequences through its ability to represent dependencies or relationships between feature vectors. It was previously shown that the ability of recurrent neural networks to process sequences increases the VC-dim compared with that of feedforward neural networks, which are limited to processing vectors. Since graphs are more general than sequences, the question arises of how this affects the VC-dimension of GNNs and RecNNs. A main finding of this paper is that the upper bounds on the VC-dim of GNNs and RecNNs are comparable to the upper bounds for recurrent neural networks. The results also suggest that the generalization capability of such models increases with the number of connected nodes.
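As background, the VC dimension mentioned in the abstract has a standard textbook definition; the symbols \(\mathcal{H}\), \(S\), and \(h\) below are generic notation, not taken from the paper:

```latex
% Shattering: a hypothesis class H of binary classifiers shatters a finite set
% S = {x_1, ..., x_n} if every possible labeling of S is realized by some h in H.
\forall (y_1,\dots,y_n) \in \{0,1\}^n \;\; \exists h \in \mathcal{H}:\quad
  h(x_i) = y_i \quad \text{for } i = 1,\dots,n.

% The VC dimension of H is the size of the largest set it can shatter.
\operatorname{VC}(\mathcal{H}) \;=\; \max\{\, n : \exists S,\ |S| = n,\ \mathcal{H}\ \text{shatters}\ S \,\}.
```

To make "processing inputs given as graphs" concrete, the sketch below shows the kind of iterated node-state update used by GNNs. It is a minimal illustration assuming a plain NumPy setting; the function gnn_state_updates, the weight matrices W_self and W_neigh, and the tanh update are hypothetical choices, not the exact model analyzed in the paper:

```python
import numpy as np

def gnn_state_updates(adj, feats, W_self, W_neigh, steps=10):
    """Illustrative GNN-style iteration (not the paper's exact model).

    Each node's state is repeatedly recomputed from the node's own features
    and the sum of its neighbors' current states:
        x_v <- tanh(W_self @ l_v + W_neigh @ sum_{u in N(v)} x_u)
    """
    n = adj.shape[0]
    x = np.zeros((n, W_neigh.shape[0]))      # initial node states
    for _ in range(steps):
        agg = adj @ x                        # sum of neighbor states per node
        x = np.tanh(feats @ W_self.T + agg @ W_neigh.T)
    return x

# Usage on a 3-node path graph with 2-dimensional node features.
rng = np.random.default_rng(0)
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = rng.normal(size=(3, 2))
W_self = rng.normal(size=(4, 2))             # hidden size 4, feature size 2
W_neigh = rng.normal(size=(4, 4))
print(gnn_state_updates(adj, feats, W_self, W_neigh).shape)  # (3, 4)
```

The fixed number of update steps stands in here for the convergent fixed-point iteration used by the original GNN formulation; after several steps, each node's state depends on a progressively larger neighborhood of the graph.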
Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 108, December 2018, Pages 248-259
Authors