Article ID | Journal ID | Publication Year | English Article | Full Text |
---|---|---|---|---|
6863236 | 678053 | 2016 | 26-page PDF | Free download |
English Title (ISI Article)
A divide-and-combine method for large scale nonparallel support vector machines
Persian Translation of the Title
یک روش تقسیم و ترکیب برای ماشین‌های بردار پشتیبان غیرموازی در مقیاس بزرگ
Keywords
Support vector machine; Nonparallel support vector machine; Large scale; Clustering; Divide; Combine
Related Subjects
Engineering and Basic Sciences
Computer Engineering
Artificial Intelligence
English Abstract
Nonparallel Support Vector Machine (NPSVM), which is more flexible and generalizes better than the typical SVM, is widely used for classification. Although methods and toolboxes such as SMO and libsvm are available for NPSVM, it is hard to scale up when facing millions of samples. In this paper, we propose a divide-and-combine method for large scale nonparallel support vector machines (DCNPSVM). In the division step, DCNPSVM divides the samples into smaller sub-samples so that smaller subproblems can be solved independently. We prove theoretically and show experimentally that the objective function value, solutions, and support vectors obtained by DCNPSVM are close to those of the whole NPSVM problem. In the combination step, the sub-solutions, combined as initial iteration points, are used to solve the whole problem by global coordinate descent, which converges quickly. To balance accuracy and efficiency, we adopt a multi-level structure that outperforms state-of-the-art methods. Moreover, DCNPSVM can tackle unbalanced problems efficiently by tuning its parameters. Experimental results on many large data sets show the effectiveness of our method in memory usage, classification accuracy, and time consumption.
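The divide-and-combine workflow described in the abstract can be illustrated with a minimal sketch. This is not the authors' DCNPSVM: a plain linear SVM trained by subgradient descent stands in for the NPSVM subproblem solver, the chunks are random rather than cluster-based, and the sub-solutions are simply averaged to warm-start the global solve. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def train_linear_svm(X, y, w0=None, lr=0.05, lam=0.01, epochs=100):
    """Train a linear SVM (hinge loss + L2) by subgradient descent.
    A plain linear SVM stands in here for the NPSVM subproblem solver."""
    n, d = X.shape
    w = np.zeros(d) if w0 is None else w0.copy()
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1                      # margin-violating samples
        grad = lam * w - (y[active] @ X[active]) / n
        w -= lr * grad
    return w

def divide_and_combine_svm(X, y, k=4, seed=0):
    """Division step: split the samples into k chunks and solve each
    subproblem independently.  Combination step: average the
    sub-solutions and use the result as the initial iteration point
    for a (warm-started, hence shorter) global solve."""
    rng = np.random.default_rng(seed)
    chunks = np.array_split(rng.permutation(len(y)), k)
    sub_ws = [train_linear_svm(X[c], y[c]) for c in chunks]   # division
    w_init = np.mean(sub_ws, axis=0)                          # combination
    return train_linear_svm(X, y, w0=w_init, epochs=20)       # global solve

# Toy data: two well-separated Gaussian blobs, labels in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.hstack([-np.ones(200), np.ones(200)])
w = divide_and_combine_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The point of the combination step is that the averaged sub-solutions are already close to the global optimum, so the final solve needs far fewer iterations than training from scratch, which is what makes the scheme attractive at large scale.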
Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 75, March 2016, Pages 12-21
Authors
Yingjie Tian, Xuchan Ju, Yong Shi