Article code: 495251
Journal code: 862821
Year of publication: 2015
English article, full text: 14 pages, PDF, free download
English title of the ISI article
Efficient multi-criteria optimization on noisy machine learning problems
Title (translated from the Persian rendering)
Efficient multi-criteria optimization on noisy machine learning problems
Related topics
Engineering and basic sciences; computer engineering; computer science software
English abstract


• The Kriging-based EGO techniques performed better than the baseline LHS approach.
• The use of re-interpolation is crucial to cope with noise.
• Repeats can be necessary but also decrease the number of possible infill points.
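The baseline LHS design mentioned in the highlights can be sketched as follows. This is a minimal illustration of Latin hypercube sampling in the unit cube, not the paper's experimental setup; the function name and dimensions are illustrative.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Draw a Latin hypercube sample in [0, 1)^n_dims.

    Each dimension is split into n_samples equal strata; exactly one
    point falls in each stratum, with strata permuted independently
    per dimension.
    """
    rng = np.random.default_rng(seed)
    # Random offset within each stratum.
    u = rng.random((n_samples, n_dims))
    # Independent random permutation of the strata in each dimension.
    perms = np.argsort(rng.random((n_samples, n_dims)), axis=0)
    return (perms + u) / n_samples

# Example: a 10-point design in 2 dimensions.
X = latin_hypercube(10, 2, seed=0)
```

Each column of `X` then contains exactly one point in every interval [i/10, (i+1)/10), which is the stratification property that makes LHS a stronger space-filling baseline than plain uniform sampling.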

Recent research revealed that model-assisted parameter tuning can improve the quality of supervised machine learning (ML) models. In particular, the tuned models were found to generalize better and to be more robust than models obtained with other optimization approaches. However, these advantages often come at the cost of high computation times, which makes employing tuning algorithms a real burden. Training with a reduced number of patterns can alleviate this, but it typically lowers model accuracy and increases instability and noise. Hence, we propose a novel approach defined as a two-criteria optimization task, in which both the runtime and the quality of ML models are optimized. Because the budgets for this optimization task are usually very restricted in ML, the surrogate-assisted Efficient Global Optimization (EGO) algorithm is adapted. To cope with noisy experiments, we apply two hypervolume-indicator-based EGO algorithms with smoothing and re-interpolation of the surrogate models; these techniques do not need replicates. We find that these EGO techniques can outperform traditional approaches such as Latin hypercube sampling (LHS), as well as EGO variants with replicates.
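The EGO infill step described above can be illustrated with the classic single-objective expected-improvement criterion. This is a simplification for illustration only: the paper uses hypervolume-indicator-based criteria for the two-objective case, and the surrogate mean and standard deviation here would come from a Kriging model, which is not shown.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement for minimization at one candidate point.

    mu, sigma : surrogate (e.g. Kriging) predictive mean and std there.
    f_best    : best objective value observed so far.
    """
    if sigma <= 0.0:
        # Deterministic prediction: improvement only if mu beats f_best.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))      # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # phi(z)
    return (f_best - mu) * cdf + sigma * pdf
```

EGO repeatedly fits the surrogate to all evaluated points and evaluates the true (expensive, noisy) objective at the candidate maximizing this criterion; the re-interpolation step the paper highlights adjusts the surrogate so it does not interpolate the noise exactly.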


Publisher
Database: Elsevier - ScienceDirect
Journal: Applied Soft Computing - Volume 29, April 2015, Pages 357–370
Authors