Article ID Journal Published Year Pages File Type
417281 Computational Statistics & Data Analysis 2008 21 Pages PDF
Abstract

A minimum disparity estimator minimizes a φ-divergence between the marginal density of a parametric model and a non-parametric estimate of that density. This principle is applied to the estimation of stochastic differential equation models, choosing the Hellinger distance as the particular φ-divergence. Under a stationarity hypothesis, the parametric marginal density is obtained by solving the Kolmogorov forward equation. Particular emphasis is placed on the non-parametric estimation of the sample marginal density, which must account for sample dependence and kurtosis. A new window-size (bandwidth) selection rule is provided. The classical estimator is presented alternatively as a distance minimizer and as a pseudo-likelihood maximizer; the latter presentation opens the way to Bayesian inference. The method is applied to continuous-time models of the interest rate. In particular, various models are compared using alternative tests, and the results are discussed.
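The abstract's core idea can be sketched in a few lines: fit a parametric stationary density to a kernel density estimate of the sample by minimizing the Hellinger distance. The sketch below is a hypothetical illustration, not the paper's procedure: it uses an Ornstein-Uhlenbeck process (whose stationary marginal is Gaussian, standing in for the solution of the Kolmogorov forward equation), a plain Gaussian KDE rather than the paper's dependence- and kurtosis-adjusted estimator, and a default bandwidth rather than the new window-size rule.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gaussian_kde, norm

# Hypothetical example: Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW.
# Its stationary marginal is N(0, sigma^2 / (2*theta)); we estimate theta by
# minimizing the squared Hellinger distance between this parametric density
# and a kernel density estimate of the simulated sample.

rng = np.random.default_rng(0)
theta_true, sigma, dt, n = 1.0, 0.5, 0.01, 20000

# Euler-Maruyama simulation of the OU path (started at the stationary mean)
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = x[i - 1] - theta_true * x[i - 1] * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# Non-parametric marginal estimate (default bandwidth; the paper's own
# bandwidth rule accounting for dependence and kurtosis is not reproduced here)
kde = gaussian_kde(x)
grid = np.linspace(x.min(), x.max(), 512)
f_hat = kde(grid)
dx = grid[1] - grid[0]

def hellinger_sq(theta):
    # Squared Hellinger distance between the KDE and N(0, sigma^2/(2*theta)),
    # approximated on a uniform grid
    f_theta = norm.pdf(grid, scale=sigma / np.sqrt(2.0 * theta))
    return 0.5 * np.sum((np.sqrt(f_hat) - np.sqrt(f_theta)) ** 2) * dx

res = minimize_scalar(hellinger_sq, bounds=(0.05, 10.0), method="bounded")
print(f"estimated theta: {res.x:.2f}")
```

Because the data are serially dependent, the effective sample size is far smaller than n, which is exactly why the abstract stresses a dependence-aware density estimate; with the naive KDE above, the minimizer still lands near the true drift parameter but with inflated variability.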

Related Topics
Physical Sciences and Engineering Computer Science Computational Theory and Mathematics
Authors