Article ID: 694546
Journal: Acta Automatica Sinica
Published Year: 2008
Pages: 8
File Type: PDF
Abstract

Unlike conventional evaluation criteria based on performance measures, information-theoretic criteria offer a uniquely beneficial feature in machine-learning applications. However, we are still far from an in-depth understanding of "entropy"-type criteria, particularly in relation to the conventional performance-based criteria. This paper studies generic classification problems, which include a rejected, or unknown, class. We present the basic formulas and a schematic diagram of classification learning based on information theory. A closed-form equation is derived relating the normalized mutual information to the augmented confusion matrix for generic classification problems. Three theorems and one set of sensitivity equations are given for studying the relations between mutual information and conventional performance indices. We also present numerical examples and discuss the advantages and limitations of mutual-information criteria in comparison with the conventional criteria.
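The paper's exact closed-form equation is not reproduced in this abstract, but the quantity it relates can be illustrated with a standard computation: the mutual information between the true-class and predicted-class variables, normalized by the entropy of the true class, evaluated directly on an augmented confusion matrix whose extra column counts rejected (unknown-class) samples. The sketch below uses only this textbook definition; the example matrix values are hypothetical.

```python
import math

def normalized_mutual_info(cm):
    """Normalized mutual information NI = I(T; Y) / H(T) from a confusion
    matrix cm: rows are true classes, columns are predicted classes.
    An extra final column counting rejected samples turns cm into an
    'augmented' confusion matrix, as discussed in the paper."""
    n = sum(sum(row) for row in cm)
    row_p = [sum(row) / n for row in cm]                       # marginal p(t)
    col_p = [sum(cm[i][j] for i in range(len(cm))) / n
             for j in range(len(cm[0]))]                       # marginal p(y)
    mi = 0.0
    for i, row in enumerate(cm):
        for j, count in enumerate(row):
            if count > 0:                                      # 0 log 0 := 0
                p = count / n                                  # joint p(t, y)
                mi += p * math.log2(p / (row_p[i] * col_p[j]))
    h_t = -sum(p * math.log2(p) for p in row_p if p > 0)       # entropy H(T)
    return mi / h_t

# Hypothetical 2-class problem with a reject column (last column):
cm = [[40, 5, 5],
      [4, 40, 6]]
print(round(normalized_mutual_info(cm), 4))  # prints 0.4708
```

The measure lies in [0, 1]: it reaches 1 only for a diagonal (error- and rejection-free) confusion matrix and, unlike accuracy, penalizes both misclassifications and rejections through their effect on the joint distribution.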

Related Topics: Physical Sciences and Engineering › Engineering › Control and Systems Engineering