Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
4947240 | Neurocomputing | 2017 | 23 Pages | |
Abstract
This paper presents a method for determining how closely a neural network (and its design) represents and models the systems it is intended to capture. The output of the method is a normalized numerical value (an index) that can be compared with other indices; it can also be used to compare one artificial neural network with another. The objective is to develop a numerical index of quality that makes it possible to compare a variety of methods and computational techniques. Section 2 revisits the concept from which this paper originated, summarizing the Statistical Index of Quality, and subsection 3.2 outlines the Neural Network Index of Quality. Subsection 3.3 illustrates sample calculations and comparisons with other similar methods for neural network evaluation. Finally, section 4 addresses certain implications and outlines future research in pursuit of automated quality analysis of computational techniques. The proposed method compares different computational techniques adequately, yielding results close to '1' for well-designed networks and results close to '0' for neural networks with possible design flaws. The method is designed for fixed-size, supervised artificial neural networks.
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors
Ricardo E. Monge, Juan L. Crespo