| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 1144870 | Journal of the Korean Statistical Society | 2011 | 17 | |
The time-continuous, discrete-state Markov process is a model for rating transitions. A single parameter, the intensity of migrating to an adjacent rating state, gives the ordinal rating scale an intuitive metric. State-specific intensities generalize this state-stationarity. When the Markov processes are observed under a multiplicative intensity model, the maximum likelihood parameter estimators for both models can be studied via the score statistic, written as a martingale transform of the processes that count transitions between the rating states. A Taylor expansion establishes consistency and asymptotic normality of the parameter estimates, yielding a χ²-distributed likelihood ratio for testing state-stationarity against the state-specific model. The argument extends to time-stationarity. Simulations contrast the asymptotic results with finite-sample behavior. An application to a sufficiently large set of credit rating histories shows that the one-parameter model can be a good starting point.
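As a rough illustration of the comparison described above, the following sketch estimates the one-parameter (state-stationary) adjacent-transition intensity and the state-specific intensities from fully observed rating paths, then forms the likelihood ratio with a χ² reference distribution. It is a minimal sketch under simplifying assumptions (constant intensities, equal intensity to both adjacent states, complete sojourn-level observation); the function `lr_test`, the data layout, and the degrees-of-freedom rule are illustrative choices, not the paper's exact procedure.

```python
# Hypothetical sketch: likelihood-ratio test of a state-stationary
# adjacent-transition intensity against state-specific intensities.
# Each rating history is assumed to be a list of (state, sojourn_time)
# pairs; all names below are illustrative, not taken from the paper.
import numpy as np
from scipy.stats import chi2

def lr_test(histories, n_states):
    """histories: list of rating paths, each a list of (state, sojourn) tuples."""
    # time at risk and observed transitions out of each state
    exposure = np.zeros(n_states)
    transitions = np.zeros(n_states)
    for path in histories:
        for i, (state, sojourn) in enumerate(path):
            exposure[state] += sojourn
            if i + 1 < len(path):           # a transition ended this sojourn
                transitions[state] += 1
    # number of adjacent states: 1 at the boundaries, 2 in the interior
    adj = np.full(n_states, 2.0)
    adj[0] = adj[-1] = 1.0
    risk = adj * exposure                    # multiplicative-intensity exposure

    # one-parameter (state-stationary) MLE and log-likelihood
    lam_0 = transitions.sum() / risk.sum()
    ll_0 = (transitions * np.log(lam_0) - lam_0 * risk).sum()

    # state-specific MLEs and log-likelihood (unvisited states contribute 0)
    with np.errstate(divide="ignore", invalid="ignore"):
        lam_k = np.where(risk > 0, transitions / risk, 0.0)
        ll_1 = np.where(transitions > 0,
                        transitions * np.log(lam_k) - lam_k * risk,
                        -lam_k * risk).sum()

    lr = 2.0 * (ll_1 - ll_0)
    df = int((risk > 0).sum()) - 1           # extra free intensities
    return lr, df, chi2.sf(lr, df)

# toy usage: two short rating histories on a four-state scale
paths = [[(1, 0.8), (2, 1.5), (1, 2.0)], [(0, 1.2), (1, 0.5), (2, 3.0)]]
print(lr_test(paths, n_states=4))
```

The closed-form intensity estimates (transitions divided by exposure) follow from the counting-process likelihood with constant intensities; for time-varying or covariate-dependent intensities the estimation step would change, but the structure of the likelihood-ratio comparison would not.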