Article code: 402550
Journal code: 676963
Publication year: 2016
English article: 11 pages, PDF
Full-text version: free download
English title of the ISI article
Cost-sensitive feature selection using random forest: Selecting low-cost subsets of informative features
Persian translation of the title
انتخاب ویژگی حساس به هزینه با استفاده از جنگل تصادفی: انتخاب زیرمجموعه‌های کم‌هزینه از ویژگی‌های آموزنده
Related topics
Engineering and Basic Sciences / Computer Engineering / Artificial Intelligence
English abstract

Feature selection aims to select a small subset of informative features that contains most of the information relevant to a given task. Existing feature selection methods often assume that all features have the same cost. In many real-world applications, however, different features may have different costs (e.g., the different tests a patient might undergo in medical diagnosis). Ignoring feature costs may yield feature subsets that are good in theory but cannot be used in practice. In this paper, we propose a random forest-based feature selection algorithm that incorporates feature costs into the construction of the base decision trees to produce low-cost feature subsets. In particular, when a base tree is constructed, a feature is randomly selected with a probability inversely proportional to its associated cost. We evaluate the proposed method on a number of UCI datasets and apply it to a medical diagnosis problem in which the real feature costs are estimated by experts. The experimental results demonstrate that our feature-cost-sensitive random forest (FCS-RF) is able to select a low-cost subset of informative features and achieves better performance than other state-of-the-art feature selection methods on real-world problems.
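The selection mechanism described in the abstract (candidate features drawn with probability inversely proportional to their cost when growing each base tree) can be illustrated with a minimal Python sketch. The function name sample_split_candidates, the example cost values, and the number of candidates per split are illustrative assumptions, not taken from the paper.

import numpy as np

def sample_split_candidates(costs, n_candidates, rng=None):
    # Draw candidate features for one tree split; selection probability
    # is inversely proportional to each feature's cost, so cheaper
    # features are favoured (illustrative sketch, not the authors' code).
    rng = np.random.default_rng() if rng is None else rng
    weights = 1.0 / np.asarray(costs, dtype=float)
    probs = weights / weights.sum()
    return rng.choice(len(costs), size=n_candidates, replace=False, p=probs)

# Example: five features with hypothetical expert-estimated costs,
# two candidate features considered at each split.
feature_costs = [1.0, 5.0, 0.5, 2.0, 10.0]
print(sample_split_candidates(feature_costs, n_candidates=2))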

Publisher
Database: Elsevier - ScienceDirect
Journal: Knowledge-Based Systems - Volume 95, 1 March 2016, Pages 1–11
Authors