Article ID: 6854862
Journal: Expert Systems with Applications
Published Year: 2018
Pages: 27
File Type: PDF
Abstract
Most machine learning algorithms possess hyperparameters. For example, an artificial neural network requires the user to choose the number of hidden layers, the number of nodes per layer, and many other settings that govern the model-fitting process. Despite their importance, there is still no clear consensus on how to tune these hyperparameters. The most popular methodology is an exhaustive grid search, which can be highly inefficient and sometimes infeasible. Another common solution is to change one hyperparameter at a time and measure its effect on the model's performance. However, this can also be inefficient and does not guarantee optimal results, since it ignores interactions between the hyperparameters. In this paper, we propose to use the Design of Experiments (DOE) methodology (factorial designs) for screening and Response Surface Methodology (RSM) to tune a machine learning algorithm's hyperparameters. An application of our methodology is presented with a detailed discussion of the results of a random forest case study using a publicly available dataset. Benefits include fewer training runs, better hyperparameter selection, and a disciplined approach grounded in statistical theory.
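To make the screening step concrete, the sketch below runs a two-level full factorial design over three random forest hyperparameters and scores each run with cross-validation. This is a minimal illustration, not the paper's implementation: the dataset, the choice of factors (n_estimators, max_depth, min_samples_leaf), and their low/high levels are assumptions, and scikit-learn is assumed for the random forest.

```python
# Screening sketch: a 2^3 full factorial design over three assumed
# random forest hyperparameters, each evaluated by 5-fold CV accuracy.
from itertools import product

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # illustrative public dataset

# Two levels (low, high) per factor -> 2^3 = 8 runs instead of a dense grid.
levels = {
    "n_estimators": (50, 500),
    "max_depth": (3, 15),
    "min_samples_leaf": (1, 10),
}

results = []
for combo in product(*levels.values()):
    params = dict(zip(levels.keys(), combo))
    model = RandomForestClassifier(random_state=0, **params)
    score = cross_val_score(model, X, y, cv=5).mean()
    results.append((params, score))
    print(params, f"CV accuracy = {score:.4f}")
```

Eight training runs suffice to estimate all main effects and two-factor interactions for these three factors; the factors found to matter would then be carried into an RSM phase, e.g. a central composite design with a fitted second-order model, to locate promising hyperparameter settings.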
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence