Article ID: 6858761
Journal: International Journal of Approximate Reasoning
Published Year: 2018
Pages: 20 Pages
File Type: PDF
Abstract
In this paper we propose a scalable importance sampling algorithm for computing Gaussian mixture posteriors in conditional linear Gaussian Bayesian networks. Our contribution is a stochastic gradient ascent procedure that takes as input a stream of importance sampling weights and dynamically updates a mixture of Gaussians, so that the full sample never needs to be stored. The algorithm is designed following a Map/Reduce approach and is therefore scalable with respect to computing resources. An implementation of the proposed algorithm is available as part of the AMIDST open-source toolbox for scalable probabilistic machine learning (http://www.amidsttoolbox.com).
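The sketch below is a minimal illustration of the general idea described in the abstract, not the algorithm or code of the paper or of the AMIDST toolbox: a one-dimensional Gaussian mixture is fitted by stochastic gradient ascent on an importance-weighted log-likelihood, consuming one (sample, weight) pair at a time so the full sample is never stored. All names (fit_stream, eta, the toy target and proposal densities) are hypothetical choices made for the example.

    import numpy as np

    def gaussian_pdf(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def fit_stream(weighted_samples, K=2, eta=0.02, seed=0):
        """Online fit of a K-component Gaussian mixture from a stream of
        importance-weighted samples (x, w), one gradient step per sample."""
        rng = np.random.default_rng(seed)
        logits = np.zeros(K)            # mixture weights via softmax
        mu = rng.normal(0.0, 1.0, K)    # component means
        log_sigma = np.zeros(K)         # log std devs (keeps sigma > 0)

        for x, w in weighted_samples:
            pi = np.exp(logits - logits.max())
            pi /= pi.sum()
            sigma = np.exp(log_sigma)
            comp = pi * gaussian_pdf(x, mu, sigma)
            q = comp.sum() + 1e-300
            r = comp / q                # component responsibilities for x

            # gradient ascent step on w * log q(x)
            logits    += eta * w * (r - pi)
            mu        += eta * w * r * (x - mu) / sigma ** 2
            log_sigma += eta * w * r * ((x - mu) ** 2 / sigma ** 2 - 1.0)

        pi = np.exp(logits - logits.max())
        pi /= pi.sum()
        return pi, mu, np.exp(log_sigma)

    # Toy usage: bimodal target, wide Gaussian proposal, weights = target/proposal.
    def target_pdf(x):
        return 0.5 * gaussian_pdf(x, -2.0, 0.7) + 0.5 * gaussian_pdf(x, 2.0, 0.7)

    rng = np.random.default_rng(1)
    def stream(n):
        for _ in range(n):
            x = rng.normal(0.0, 4.0)                             # proposal draw
            yield x, target_pdf(x) / gaussian_pdf(x, 0.0, 4.0)   # importance weight

    pi, mu, sigma = fit_stream(stream(30000), K=2)
    print("weights:", pi, "means:", mu, "stds:", sigma)

Because each step touches only the current weighted sample, independent chunks of the stream could in principle be processed in parallel and their parameter updates combined, which is the kind of Map/Reduce-style decomposition the abstract refers to; the paper's actual scheme may differ.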
Related Topics
Physical Sciences and Engineering / Computer Science / Artificial Intelligence