Article ID: 406109
Journal: Neurocomputing
Published Year: 2015
Pages: 9
File Type: PDF
Abstract

This paper presents an identifying function (IF) approach for determining the parameter structure of statistical learning machines (SLMs). This involves studying three related aspects: structural identifiability (SI), parameter redundancy (PR), and reparameterization. First, by employing the Rank Theorem from Riemannian geometry, we derive an efficient identifiability criterion based on the rank of the derivative matrix (DM) of the IF. Second, we extend the previous concept of the IF to a local IF (LIF) for examining the local parameter structure of SLMs, and prove that the Kullback–Leibler divergence (KLD) is such a proper LIF, thereby relating the LIF approach to several existing criteria. Last, an analytical approach for solving the minimal reparameterization of parameter-redundant models is established. The dimensionality of the minimal reparameterization can be used to characterize the intrinsic parameter dimensionality of a model. We compare the IF approach with existing criteria and discuss its pros and cons from theoretical and application viewpoints. Several model examples from the literature are presented to study their parameter structure.
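The rank-based identifiability criterion described above can be illustrated numerically: if the derivative matrix of the model map with respect to the parameters is rank-deficient at a point, the model is locally parameter-redundant there. Below is a minimal sketch, assuming a hypothetical toy model (not one from the paper) whose outputs depend on two parameters only through their product, so the parameter Jacobian has rank 1 rather than 2:

```python
import numpy as np

def numerical_jacobian(f, theta, eps=1e-6):
    """Finite-difference Jacobian of f: R^p -> R^n at theta."""
    theta = np.asarray(theta, dtype=float)
    f0 = np.asarray(f(theta))
    J = np.zeros((f0.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (np.asarray(f(t)) - f0) / eps
    return J

# Hypothetical model: outputs y_i = (a*b) * x_i with parameters theta = (a, b).
# Only the product a*b is identifiable, so the derivative matrix over the
# parameters should be rank-deficient (rank 1 < 2), flagging redundancy.
x = np.linspace(0.1, 1.0, 5)
model = lambda theta: theta[0] * theta[1] * x

J = numerical_jacobian(model, [2.0, 3.0])
rank = np.linalg.matrix_rank(J, tol=1e-4)
print(rank)  # 1, i.e. fewer than the 2 nominal parameters
```

Here the rank deficit (2 − 1 = 1) suggests that a minimal reparameterization in one parameter, e.g. c = a·b, removes the redundancy, which mirrors the intrinsic-dimensionality characterization described in the abstract.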

Related Topics: Physical Sciences and Engineering › Computer Science › Artificial Intelligence