Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
398027 | 1438434 | 2016 | 14 pages PDF | Free download |
• We introduced a new class of parametric models, the PICI models.
• We focused on models with decomposable combination functions.
• An empirical study showed that such decompositions lead to faster inference.
• The likelihood function can be used to select the best-fitting decomposition.
A major difficulty in building Bayesian network (BN) models is the size of conditional probability tables (CPTs), which grow exponentially in the number of parents. One way of dealing with this problem is through parametric conditional probability distributions, which usually require only a number of parameters that is linear in the number of parents. In this paper, we introduce a new class of parametric models, the Probabilistic Independence of Causal Influences (PICI) models, that aim at lowering the number of parameters required to specify local probability distributions, while still efficiently modeling a variety of interactions. A subset of PICI models is decomposable, and this leads to significantly faster inference compared to models that cannot be decomposed. We present an application of the proposed method to learning dynamic BNs for modeling a woman's menstrual cycle. We show that PICI models are especially useful for parameter learning from small data sets and lead to higher parameter accuracy than learning full CPTs.
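To illustrate the parameter savings the abstract describes, consider the classic noisy-OR gate, the best-known member of the independence-of-causal-influences (ICI) family that PICI models generalize. A minimal sketch (the function name and parameterization here are illustrative, not the paper's notation): with n binary parents, n per-parent activation probabilities plus one leak term determine all 2^n entries of the full CPT.

```python
from itertools import product

def noisy_or_cpt(p, leak=0.0):
    """Expand a noisy-OR parameterization into a full CPT.

    p    : per-parent activation probabilities (one per parent)
    leak : probability the effect occurs with no active parent

    Only len(p) + 1 parameters define all 2**len(p) CPT entries:
    P(Y=1 | x) = 1 - (1 - leak) * prod over active parents of (1 - p_i).
    """
    n = len(p)
    cpt = {}
    for parents in product([0, 1], repeat=n):
        q = 1.0 - leak  # probability the effect is absent
        for active, p_i in zip(parents, p):
            if active:
                q *= 1.0 - p_i
        cpt[parents] = 1.0 - q
    return cpt

# 3 parents: 4 parameters stand in for an 8-row CPT.
cpt = noisy_or_cpt([0.8, 0.6, 0.5], leak=0.1)
print(round(cpt[(0, 0, 0)], 3))  # leak only
print(round(cpt[(1, 1, 0)], 3))  # 1 - 0.9 * 0.2 * 0.4
```

With 10 parents the gap is already striking: 11 noisy-OR parameters versus 1024 CPT entries, which is exactly the kind of reduction that makes parameter learning from small data sets feasible.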
Journal: International Journal of Approximate Reasoning - Volume 70, March 2016, Pages 123–136