Article ID | Journal ID | Publication year | Article | Full text |
---|---|---|---|---|
10326071 | 677481 | 2005 | 7-page PDF | Free download |
English title of the ISI article
On the relationship between deterministic and probabilistic directed Graphical models: From Bayesian networks to recursive neural networks
Keywords
Related subjects
Engineering and Basic Sciences
Computer Engineering
Artificial Intelligence
English abstract
Machine learning methods that can handle variable-size structured data such as sequences and graphs include Bayesian networks (BNs) and Recursive Neural Networks (RNNs). In both classes of models, the data is modeled using a set of observed and hidden variables associated with the nodes of a directed acyclic graph. In BNs, the conditional relationships between parent and child variables are probabilistic, whereas in RNNs they are deterministic and parameterized by neural networks. Here, we study the formal relationship between both classes of models and show that when the source node variables are observed, RNNs can be viewed as limits, both in distribution and probability, of BNs with local conditional distributions that have vanishing covariance matrices and converge to delta functions. Conditions for uniform convergence are also given together with an analysis of the behavior and exactness of Belief Propagation (BP) in 'deterministic' BNs. Implications for the design of mixed architectures and the corresponding inference algorithms are briefly discussed.
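A minimal numerical sketch of the limit the abstract describes (an illustration, not code from the paper): for a single parent-child edge, the BN draws the child from a Gaussian centered at a neural-style map `f(parent)` (here `tanh`, chosen only for illustration), while the RNN computes `f(parent)` deterministically. As the conditional variance shrinks, BN samples concentrate on the RNN value, i.e. the conditional distribution approaches a delta function.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh                     # deterministic parent-to-child map (neural-net style)
parent = 0.8
deterministic_child = f(parent) # the RNN's output for this node

# BN conditional: child ~ N(f(parent), sigma^2). Mean absolute deviation
# from the deterministic value shrinks as sigma -> 0.
for sigma in [1.0, 0.1, 0.01]:
    samples = rng.normal(loc=f(parent), scale=sigma, size=10_000)
    print(sigma, np.abs(samples - deterministic_child).mean())
```

The printed deviations decrease roughly in proportion to `sigma`, illustrating convergence in probability of the BN's child variable to the RNN's deterministic computation.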
Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 18, Issue 8, October 2005, Pages 1080-1086
Authors
Pierre Baldi, Michal Rosen-Zvi