Article ID: 377541
Journal: Artificial Intelligence in Medicine
Published Year: 2016
Pages: 10 Pages
File Type: PDF
Abstract

• Kernel deep stacking networks (KDSNs) are a novel method in biomedical research.
• KDSNs belong to the class of supervised deep learning.
• They are computationally faster to train than artificial neural networks.
• KDSNs require the specification of a large number of tuning parameters.
• We propose a new data-driven framework for model selection in KDSNs.
• The proposed methodology includes model-based optimization and hill climbing.
• No pre-specification of any of the KDSN tuning parameters is required.
• Application of the proposed methodology results in a fast tuning procedure.
• KDSNs are competitive with other techniques in the field of deep learning.

Background and objectives
Kernel deep stacking networks (KDSNs) are a novel method for supervised learning in biomedical research. Belonging to the class of deep learning techniques, KDSNs are based on artificial neural network architectures that involve multiple nonlinear transformations of the input data. Unlike traditional artificial neural networks, KDSNs do not rely on backpropagation algorithms but on an efficient fitting procedure based on a series of kernel ridge regression models with closed-form solutions. Although computationally advantageous, KDSN modeling remains a challenging task, as it requires the specification of a large number of tuning parameters.

Methods and material
We propose a new data-driven framework for parameter estimation, hyperparameter tuning, and model selection in KDSNs. The proposed methodology is based on a combination of model-based optimization and hill climbing approaches that do not require the pre-specification of any of the KDSN tuning parameters. We demonstrate the performance of KDSNs by analyzing three medical data sets on hospital readmission of diabetes patients, coronary artery disease, and hospital costs.

Results
Our numerical studies show that the run-time of the proposed KDSN methodology is significantly shorter than the respective run-time of grid search strategies for hyperparameter tuning. They also show that KDSN modeling is competitive in terms of prediction accuracy with other state-of-the-art techniques for statistical learning.

Conclusions
KDSNs are a computationally efficient approximation of backpropagation-based artificial neural network techniques. Application of the proposed methodology results in a fast tuning procedure that generates KDSN fits with prediction accuracy similar to that of other techniques in the field of deep learning.
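The core idea described above, stacking a series of kernel ridge regression modules with closed-form solutions instead of training by backpropagation, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes an RBF kernel, a single-output regression setting, and the common stacking convention of appending each module's predictions to the original features as input for the next module; all function names and parameters here are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gram matrix of the RBF kernel between rows of A and rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kdsn(X, y, n_layers=3, gamma=1.0, lam=0.1):
    """Toy kernel deep stacking network: each layer is a kernel ridge
    regression, solved in closed form, on the original features
    concatenated with the previous layer's predictions."""
    Z, layers = X, []
    for _ in range(n_layers):
        K = rbf_kernel(Z, Z, gamma)
        # Closed-form kernel ridge solution: alpha = (K + lam*I)^{-1} y
        alpha = np.linalg.solve(K + lam * np.eye(len(Z)), y)
        layers.append((Z.copy(), alpha))
        pred = K @ alpha
        # Stack this layer's predictions as an extra input feature
        Z = np.hstack([X, pred[:, None]])
    return layers

def predict_kdsn(layers, X, gamma=1.0):
    Z = X
    for Z_train, alpha in layers:
        pred = rbf_kernel(Z, Z_train, gamma) @ alpha
        Z = np.hstack([X, pred[:, None]])
    return pred

# Usage: fit a small 1-D regression problem
X = np.linspace(0.0, 3.0, 40)[:, None]
y = np.sin(X).ravel()
layers = fit_kdsn(X, y, n_layers=2, gamma=2.0, lam=0.01)
y_hat = predict_kdsn(layers, X, gamma=2.0)
```

Because each module reduces to a linear solve, no gradient-based training loop is needed; the open question the paper addresses is how to choose the tuning parameters (here `n_layers`, `gamma`, `lam`), for which the authors combine model-based optimization with hill climbing rather than grid search.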
