Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
326393 | 542401 | 2015 | 12-page PDF | Free download |

• PDA couples non-parametric methods with MCMC to estimate a model’s posterior.
• Kernel density estimation is used to construct an approximate likelihood.
• Signal processing methods are used to accelerate this process.
• A “resampled MCMC” that improves chain mixing for this method is presented.
• Approximation errors are characterized theoretically and through example.
A critical task in modeling is to determine how well the theoretical assumptions encoded in a model account for observations. Bayesian methods are an ideal framework for doing just this. Existing approximate Bayesian computation (ABC) methods, however, rely on "summary statistics" that are often insufficient. Here, I present and analyze a highly efficient extension of the recently proposed Probability Density Approximation (PDA) method (Turner and Sederberg, 2014), which circumvents this insufficiency. The method combines Markov chain Monte Carlo simulation with tools from non-parametric statistics to improve upon existing ABC methods. The primary contributions of this article are: (1) a more efficient implementation of the method that substantially improves computational performance; (2) theoretical results describing how methodological approximation errors influence posterior estimation; in particular, although the method is highly accurate, even small errors strongly affect model comparisons based on standard statistical approaches such as the deviance information criterion; (3) an augmentation of the standard PDA procedure, termed "resampled PDA", that reduces the negative influence of approximation errors on performance and accuracy; and (4) a number of examples of varying complexity, presented along with supplementary code for their implementation.
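The article's supplementary code is not reproduced here, but the general recipe the abstract describes (an MCMC sampler whose likelihood is replaced by a kernel density estimate built from model simulations, with the synthetic data refreshed at every iteration in the "resampled" variant) can be sketched in a few lines. The sketch below is an illustration of that recipe, not the author's implementation: the names (`kde_log_likelihood`, `resampled_pda_mh`, `simulate`, `log_prior`, `n_synth`, `prop_sd`) are hypothetical, a direct Gaussian-kernel sum stands in for the FFT-accelerated density estimate the article describes, and the reading of "resampled PDA" as re-estimating the current state's approximate likelihood at each iteration is an assumption based on the abstract.

```python
import numpy as np


def kde_log_likelihood(data, synthetic, bandwidth=None):
    """Approximate the log-likelihood of `data` via a Gaussian kernel
    density estimate built from model-simulated `synthetic` samples."""
    data = np.asarray(data, dtype=float)
    synthetic = np.asarray(synthetic, dtype=float)
    n = synthetic.size
    if bandwidth is None:
        # Silverman's rule-of-thumb bandwidth
        bandwidth = 1.06 * np.std(synthetic) * n ** (-0.2)
    # Direct O(len(data) * n) evaluation; the FFT-based acceleration
    # described in the article is omitted here for brevity.
    z = (data[:, None] - synthetic[None, :]) / bandwidth
    dens = np.exp(-0.5 * z ** 2).sum(axis=1) / (n * bandwidth * np.sqrt(2.0 * np.pi))
    return np.log(np.maximum(dens, 1e-300)).sum()


def resampled_pda_mh(data, simulate, log_prior, theta0, n_iter=5000,
                     n_synth=10_000, prop_sd=0.1, seed=None):
    """Random-walk Metropolis-Hastings with a KDE-approximated likelihood.
    The current state's approximate likelihood is re-estimated from a fresh
    synthetic data set every iteration (the "resampled" variant, as read from
    the abstract), so one optimistic likelihood estimate cannot freeze the chain."""
    rng = np.random.default_rng(seed)
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    chain = np.empty((n_iter, theta.size))
    for t in range(n_iter):
        lp_curr = kde_log_likelihood(data, simulate(theta, n_synth, rng)) + log_prior(theta)
        prop = theta + rng.normal(0.0, prop_sd, size=theta.size)
        lp_prop = kde_log_likelihood(data, simulate(prop, n_synth, rng)) + log_prior(prop)
        # Symmetric proposal, so the acceptance ratio reduces to the posterior ratio
        if np.log(rng.uniform()) < lp_prop - lp_curr:
            theta = prop
        chain[t] = theta
    return chain


# Toy usage: recover the mean of a Gaussian with known unit variance.
rng = np.random.default_rng(1)
obs = rng.normal(1.5, 1.0, size=200)
simulate = lambda th, n, r: r.normal(th[0], 1.0, size=n)
log_prior = lambda th: -0.5 * th[0] ** 2 / 100.0  # broad N(0, 10^2) prior, up to a constant
samples = resampled_pda_mh(obs, simulate, log_prior, theta0=[0.0],
                           n_iter=2000, n_synth=5000, prop_sd=0.25, seed=2)
print("posterior mean of the chain (after burn-in):", samples[500:].mean())
```

In this toy run the sampler recovers the mean of a Gaussian; for the response-time models typical of this journal, `simulate` would instead draw latencies from the process model under the proposed parameters, and the bandwidth and resampling choices would follow the error analysis developed in the article.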
Journal: Journal of Mathematical Psychology - Volumes 68–69, October–December 2015, Pages 13–24