Article ID: 6865110
Journal: Neurocomputing
Published Year: 2018
Pages: 48
File Type: PDF
Abstract
This paper introduces a new formal test of the significance of neural network inputs. The test is simple, accurate, and powerful. It is based on a linear relationship between the target values of the network and its output when all input variables other than the one under test are fixed at their mean values. Simulation results show that as the number of observations increases, the power of the test tends to 1 in all cases, and the empirical size approaches the nominal size in some cases. The results based on ordinary least squares (OLS) estimation of the parameters are very encouraging, and using a heteroscedasticity and autocorrelation consistent (HAC) covariance matrix together with the fast double bootstrap improves the speed of convergence to the nominal size. The test can also be used for nonlinear models with nuisance parameters.
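A minimal sketch of the idea described in the abstract follows, under these assumptions: `net` is any fitted regressor exposing a `predict` method (here a small scikit-learn MLP), `X` is the design matrix, `y` the targets, and `j` indexes the input under test. The function name `input_significance`, the simulated data, and the OLS-on-profile formulation are illustrative only, not the authors' exact procedure.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)   # only input 0 matters

net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                   random_state=0).fit(X, y)

def input_significance(net, X, y, j):
    """Regress y on the network output computed with every input except
    column j held at its sample mean; return the p-value of the slope
    as a measure of the significance of input j."""
    X_prof = np.tile(X.mean(axis=0), (X.shape[0], 1))
    X_prof[:, j] = X[:, j]                 # only input j is allowed to vary
    f_j = net.predict(X_prof)              # network profile in input j
    ols = sm.OLS(y, sm.add_constant(f_j)).fit()
    # As the abstract suggests, a HAC covariance (cov_type="HAC") or a
    # bootstrap could replace the default OLS standard errors.
    return ols.pvalues[1]

for j in range(3):
    print(f"input {j}: p-value = {input_significance(net, X, y, j):.4f}")
```

In this toy setup only input 0 drives the target, so its profile regression should yield a small p-value while the other inputs do not; the abstract's refinements (HAC covariance, fast double bootstrap) would be applied at the standard-error stage of the same regression.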
Related Topics
Physical Sciences and Engineering; Computer Science; Artificial Intelligence
Authors