Article ID: 385937
Journal: Expert Systems with Applications
Published Year: 2014
Pages: 9
File Type: PDF
Abstract

• Handwritten digit recognition with very low error rates is a demanding problem.
• Traditional Back Propagation learning is limited by local minima and stalling.
• There is also the need to build a full and representative learning data set.
• We address both problems with affine transformations and input noise annealing.
• Dimensionality reduction also helps to decrease the error rate.

Two problems that burden the learning process of Artificial Neural Networks trained with Back Propagation are the need to build a full and representative learning data set and the avoidance of stalling in local minima. Both problems appear to be closely related when working with the handwritten digits contained in the MNIST dataset. Using a modest-sized ANN, the proposed combination of input data transformations achieves a test error as low as 0.43%, which is competitive with more complex neural architectures such as Convolutional or Deep Neural Networks.
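
For illustration only, below is a minimal sketch of the two kinds of input data transformations named in the abstract: random affine distortions of the training images and Gaussian input noise whose amplitude is annealed over training. The function names, parameter ranges, linear annealing schedule, and the use of scipy.ndimage for the affine warp are assumptions for this sketch, not details taken from the paper.

```python
# Illustrative sketch (not the paper's exact pipeline): enlarge an MNIST-style
# training set with random affine transformations and annealed input noise.
import numpy as np
from scipy.ndimage import affine_transform  # convenient affine warp; an assumption

def random_affine(image, max_rot_deg=10.0, max_scale=0.1, max_shift=2.0, rng=None):
    """Apply a small random rotation, scaling and shift to a 28x28 image."""
    rng = np.random.default_rng() if rng is None else rng
    angle = np.deg2rad(rng.uniform(-max_rot_deg, max_rot_deg))
    scale = 1.0 + rng.uniform(-max_scale, max_scale)
    cos, sin = np.cos(angle), np.sin(angle)
    matrix = scale * np.array([[cos, -sin], [sin, cos]])   # output -> input mapping
    center = np.array(image.shape) / 2.0
    shift = rng.uniform(-max_shift, max_shift, size=2)
    offset = center - matrix @ center + shift              # keep the digit centered
    return affine_transform(image, matrix, offset=offset, order=1, mode="constant")

def noisy_batch(images, epoch, total_epochs, sigma0=0.3):
    """Input noise annealing: Gaussian noise whose amplitude decays over training."""
    sigma = sigma0 * (1.0 - epoch / total_epochs)           # assumed linear schedule
    noisy = images + np.random.normal(0.0, sigma, images.shape)
    return np.clip(noisy, 0.0, 1.0)                         # keep pixels in [0, 1]
```

Under these assumptions, each epoch would feed the network freshly distorted, noise-corrupted copies of the digits, so the effective training set grows while the injected noise shrinks as learning converges.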
