Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
6940340 | 1450011 | 2018 | 10-page PDF | Free download |
English title of the ISI article
Robust feature selection via l2,1-norm in finite mixture of regression
Related topics
Engineering and Basic Sciences
Computer Engineering
Computer Vision and Pattern Recognition

English abstract
Finite mixture of Gaussian regression (FMR) is a widely used modeling technique in supervised learning problems. When the number of features is large, feature selection is desirable to enhance model interpretability and to avoid overfitting. In this paper, we propose a robust feature selection method via l2,1-norm penalized maximum likelihood estimation (MLE) in FMR, with an extension to a sparse l2,1 penalty that combines the l1-norm with the l2,1-norm for increased flexibility. To solve the non-convex and non-smooth problem of (sparse) penalized MLE in FMR, we develop a new EM-based algorithm for numerical optimization, combining block coordinate descent with a majorization-minimization scheme in the M-step. We finally apply our method to six simulations and one real dataset to demonstrate its superior performance.
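
For orientation, the following is a minimal sketch of the l2,1-penalized FMR objective the abstract refers to; the notation (K components, mixing weights π_k, regression coefficients β_k, noise variances σ_k², penalty weight λ) is an assumed reading and is not quoted from the paper.

```latex
% Hedged sketch of the l_{2,1}-penalized MLE objective for FMR.
% All symbols (\pi_k, \beta_k, \sigma_k, \lambda) are assumed notation, not taken from the paper.
\max_{\{\pi_k,\,\boldsymbol\beta_k,\,\sigma_k\}_{k=1}^{K}}
\;\sum_{i=1}^{n} \log \sum_{k=1}^{K} \pi_k\,
\mathcal{N}\!\left(y_i \,\middle|\, \mathbf{x}_i^{\top}\boldsymbol\beta_k,\ \sigma_k^{2}\right)
\;-\; \lambda \sum_{j=1}^{p} \bigl\|\boldsymbol\beta^{(j)}\bigr\|_{2},
\qquad
\boldsymbol\beta^{(j)} = \bigl(\beta_{1j},\dots,\beta_{Kj}\bigr)^{\top}.
```

Under this reading, the group penalty ties together the coefficients of feature j across all K components, so an irrelevant feature is removed from every component at once; the sparse l2,1 variant mentioned in the abstract would add an extra l1 term of the form λ₁ Σ_{k,j} |β_{kj}| to induce within-group sparsity.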
Publisher
Database: Elsevier - ScienceDirect
Journal: Pattern Recognition Letters - Volume 108, 1 June 2018, Pages 15-22
Authors
Xiangrui Li, Dongxiao Zhu