Article ID: 1145095
Journal: Journal of the Korean Statistical Society
Published Year: 2008
Pages: 21
File Type: PDF
Abstract

There are essentially two statistical paradigms, the Bayesian and the frequentist. Despite their obvious differences, the two approaches have certain points in common. In particular, both are density (or likelihood) based and neither has a concept of approximation. By a concept of approximation we mean some formal admission of the fact that statistical models are not true representations of the data. We argue that the relationship between the data and the model is a fundamental one which cannot be reduced to either diagnostics or model validation. We argue further that a concept of approximation must be formulated in a weak topology different from the strong topology of densities. For this reason there can be no density or likelihood based concept of approximation. The concept of approximation we suggest goes back to [Donoho, D. L. (1988). One-sided inference about functionals of a density. Annals of Statistics, 16, 1390–1420] and [Davies, P. L. (1995). Data features. Statistica Neerlandica, 49, 185–245] and requires that ‘typical’ data sets simulated under the model ‘look like’ the real data set. This idea is developed using examples from nonparametric regression.
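To make the "typical data sets look like the real data" criterion concrete, the following is a minimal sketch of the idea in a nonparametric regression setting. All specifics here are illustrative assumptions, not the paper's actual procedure: a Nadaraya–Watson kernel smoother stands in for the regression estimate, i.i.d. Gaussian noise for the error model, and the longest run of same-sign residuals serves as the weak-topology "data feature"; the paper's own criteria (e.g., multiresolution conditions on residuals) are more refined.

```python
import numpy as np


def kernel_smoother(x, y, bandwidth):
    """Nadaraya-Watson estimate of the regression function at the design points."""
    # Gaussian kernel weights between every pair of design points.
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)


def longest_sign_run(residuals):
    """Length of the longest run of same-sign residuals (an illustrative 'data feature')."""
    signs = np.sign(residuals)
    best = run = 1
    for a, b in zip(signs[:-1], signs[1:]):
        run = run + 1 if a == b and a != 0 else 1
        best = max(best, run)
    return best


rng = np.random.default_rng(0)

# "Real" data: a smooth signal plus noise (hypothetical example data).
n = 200
x = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=n)

# Candidate model: fitted smoother plus i.i.d. Gaussian noise.
fit = kernel_smoother(x, y, bandwidth=0.05)
resid = y - fit
sigma = resid.std(ddof=1)

# Feature of the real data set.
observed = longest_sign_run(resid)

# Features of 'typical' data sets simulated under the candidate model.
simulated = []
for _ in range(999):
    y_sim = fit + rng.normal(scale=sigma, size=n)
    simulated.append(longest_sign_run(y_sim - kernel_smoother(x, y_sim, bandwidth=0.05)))
simulated = np.array(simulated)

# The model is judged an adequate approximation if the observed feature is
# typical of the simulated ones, e.g. lies within their central 95% range.
lo, hi = np.quantile(simulated, [0.025, 0.975])
print(f"observed longest run = {observed}, typical range = [{lo:.0f}, {hi:.0f}]")
print("adequate" if lo <= observed <= hi else "not adequate")
```

Note how the check is made in terms of a feature of the data (a functional in a weak topology) rather than a density or likelihood: an oversmoothed fit would produce long runs of same-sign residuals in the real data that simulated data sets under the model would rarely reproduce.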

Related Topics
Physical Sciences and Engineering › Mathematics › Statistics and Probability