Article ID: 5010608
Journal: Systems & Control Letters
Published Year: 2017
Pages: 12
File Type: PDF
Abstract
In this paper, an output error method is proposed for the identification of continuous-time systems with time delay from sampled data. The challenge of time delay system identification lies in the nonlinear way the delay enters the model: optimization methods initialized from poor starting values are easily trapped in local minima. To reduce the sensitivity of convergence to the choice of initial parameters, several approaches to smoothing the loss function are presented. It is shown that the loss function may possess many local minima when the data are regularly sampled with zero-order-hold inter-sample behavior. Interestingly, irregular sampling can be an efficient way to overcome these local minima. To achieve superior convergence performance, an over-parametrization approach incorporating a low-pass filtering technique is proposed to enlarge the convergence region. Theoretical analysis and simulation results are presented to demonstrate the effectiveness of the proposed method.
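As a hypothetical illustration (not code from the paper), the multimodality of an output-error loss over the delay estimate can be reproduced with a simple first-order plant driven by a periodic square-wave input held by a zero-order hold. Here the spurious minima arise from the input's periodicity (the delay estimate can lock onto the wrong period), which mirrors the alignment ambiguity the abstract describes; the paper's own analysis concerns the ZOH inter-sample behavior under regular sampling. All names and parameter values below are invented for the sketch.

```python
import numpy as np

# Hypothetical setup: first-order plant y'(t) = -y(t) + u(t - tau),
# true delay tau = 1.0, periodic square-wave input held by a ZOH.

def u_zoh(t, period=2.0):
    """Square-wave input (held between switches), zero before t = 0."""
    return np.where(t < 0, 0.0, np.sign(np.sin(2.0 * np.pi * t / period)))

def simulate(tau, t_samp, dt=5e-3):
    """Euler-integrate the delayed plant on a fine grid, sample at t_samp."""
    n = int(t_samp[-1] / dt) + 1
    tg = np.arange(n) * dt
    y = np.zeros(n)
    for k in range(n - 1):
        y[k + 1] = y[k] + dt * (-y[k] + u_zoh(tg[k] - tau))
    return np.interp(t_samp, tg, y)

true_tau = 1.0
t_reg = np.arange(0.0, 20.0, 0.5)       # regularly sampled measurement times
y_data = simulate(true_tau, t_reg)      # "measured" output

def loss(tau_hat):
    """Output-error loss for a candidate delay (plant dynamics assumed known)."""
    return float(np.sum((y_data - simulate(tau_hat, t_reg)) ** 2))

taus = np.linspace(0.0, 5.0, 101)
J = np.array([loss(th) for th in taus])

# strict interior local minima of the loss over the delay grid
mins = [i for i in range(1, len(J) - 1) if J[i] < J[i - 1] and J[i] < J[i + 1]]
print(taus[np.argmin(J)], len(mins))
```

The global minimum sits at the true delay, but a secondary local minimum appears roughly one input period away, so a local search started near it would converge to the wrong delay. Smoothing the loss, low-pass filtering, or irregular sampling, as the abstract proposes, are aimed at exactly this failure mode.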
Related Topics
Physical Sciences and Engineering Engineering Control and Systems Engineering