Article ID: 398027
Journal: International Journal of Approximate Reasoning
Published Year: 2016
Pages: 14 Pages
File Type: PDF
Abstract

• We introduce a new class of parametric models, the PICI models.
• We focus on models with decomposable combination functions.
• An empirical study shows that such decompositions lead to faster inference.
• The likelihood function can be used to select the best-fitting decomposition.

A major difficulty in building Bayesian network (BN) models is the size of conditional probability tables (CPTs), which grows exponentially in the number of parents. One way of dealing with this problem is through parametric conditional probability distributions, which usually require a number of parameters that is only linear in the number of parents. In this paper, we introduce a new class of parametric models, the Probabilistic Independence of Causal Influences (PICI) models, which aim at lowering the number of parameters required to specify local probability distributions while remaining capable of efficiently modeling a variety of interactions. A subset of PICI models is decomposable, which leads to significantly faster inference compared to models that cannot be decomposed. We present an application of the proposed method to learning dynamic BNs for modeling a woman's menstrual cycle. We show that PICI models are especially useful for parameter learning from small data sets and lead to higher parameter accuracy than learning CPTs directly.
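To make the parameter-count argument concrete, the noisy-OR gate is the classic example of an independence-of-causal-influences model. The sketch below is illustrative only (it is not the paper's PICI formulation): it contrasts the size of a full CPT over binary parents with the linear number of parameters a noisy-OR model needs.

```python
# Illustrative sketch (not from the paper): why a parametric local model
# such as noisy-OR needs far fewer parameters than a full CPT.

def cpt_size(n_parents, states=2):
    # A full CPT needs one conditional distribution per parent
    # configuration: states ** n_parents rows for discrete parents.
    return states ** n_parents

def noisy_or(active_probs):
    # Noisy-OR combination: the child is true unless every active
    # cause independently fails to produce the effect.
    failure = 1.0
    for p in active_probs:
        failure *= (1.0 - p)
    return 1.0 - failure

# With 10 binary parents, a full CPT has 2**10 = 1024 parent
# configurations, while noisy-OR needs only 10 parameters
# (one activation probability per parent).
print(cpt_size(10))            # 1024
print(noisy_or([0.9, 0.8]))    # 1 - 0.1 * 0.2 = 0.98
```

The exponential-versus-linear gap is exactly what makes such parametric local distributions attractive for learning from small data sets, since far fewer parameters must be estimated.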
