Article ID Journal Published Year Pages File Type
695744 Automatica 2014 6 Pages PDF
Abstract

For a linear time-invariant system model, this paper analyzes the convergence of the parameter estimates obtained by the prediction error method as the length of the input–output data tends to infinity. It is known that the sequence of criterion functions converges uniformly in the parameter, with probability one, as the data length tends to infinity. In general the parameter estimate is a set rather than a single point: the set on which the criterion function attains its minimum. A mathematical feature of this convergence problem is therefore that we must infer, from the convergence of a sequence of functions, the convergence of the sequence of their sets of minimizing arguments. The Hausdorff metric is suggested as a measure of the distance between sets and is then used to study this convergence problem. Under the Hausdorff metric, convergence of the parameter estimates is not guaranteed in general; a condition guaranteeing such convergence is given.
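
The phenomenon described in the abstract can be illustrated with a small numerical sketch (not from the paper; the double-well function and the vanishing tilt below are hypothetical examples). A sequence of functions g_n converges uniformly to a function f whose minimizer set contains two points, yet each g_n has a single minimizer, so the Hausdorff distance between the minimizer sets does not go to zero:

```python
def hausdorff(A, B):
    """Hausdorff distance between two finite point sets A, B in R."""
    d_ab = max(min(abs(a - b) for b in B) for a in A)
    d_ba = max(min(abs(a - b) for a in A) for b in B)
    return max(d_ab, d_ba)

def argmin_set(f, grid, tol=1e-9):
    """Grid approximation of the set of minimizing arguments of f."""
    vals = [f(x) for x in grid]
    m = min(vals)
    return [x for x, v in zip(grid, vals) if v <= m + tol]

# Symmetric double well: minimizer set is {-1, +1}.
f = lambda x: min((x + 1) ** 2, (x - 1) ** 2)
grid = [i / 100 for i in range(-200, 201)]

# Vanishing tilt: g_n -> f uniformly on the grid as n -> infinity,
# yet for every n the minimizers of g_n cluster near -1 only, so the
# Hausdorff distance to the minimizer set of f stays near 2.
for n in (10, 100, 1000):
    g = lambda x, n=n: f(x) + x / n
    d = hausdorff(argmin_set(f, grid), argmin_set(g, grid))
    print(n, d)
```

This is exactly the gap the abstract points to: uniform convergence of the criterion functions alone does not force convergence of their sets of minimizing arguments, which is why an additional condition is needed.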

Related Topics
Physical Sciences and Engineering Engineering Control and Systems Engineering
Authors