Article code: 409771
Journal code: 679090
Publication year: 2015
Full-text version: 7 pages, PDF
English title (ISI article)
Learning a hyperplane classifier by minimizing an exact bound on the VC dimension
Related subjects
Engineering and Physical Sciences › Computer Engineering › Artificial Intelligence
English abstract


• We learn a hyperplane classifier by minimizing an exact bound on its VC dimension.
• A fractional programming problem is formulated and reduced to an LP problem.
• Linear and kernel versions of the approach are explored.
• The approach, called the Minimal Complexity Machine, generalizes better than SVMs.
• On numerous benchmark datasets, the MCM uses far fewer support vectors than SVMs.

The VC dimension measures the complexity of a learning machine, and a low VC dimension leads to good generalization. While SVMs produce state-of-the-art learning performance, it is well known that the VC dimension of an SVM can be unbounded; despite good results in practice, there is no guarantee of good generalization. In this paper, we show how to learn a hyperplane classifier by minimizing an exact, or Θ, bound on its VC dimension. The proposed approach, termed the Minimal Complexity Machine (MCM), involves solving a simple linear programming problem. Experimental results show that, on a number of benchmark datasets, the proposed approach learns classifiers with error rates much lower than those of conventional SVMs, while often using fewer support vectors. On many benchmark datasets, the number of support vectors is less than one-tenth the number used by SVMs, indicating that the MCM does indeed learn simpler representations.
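As a rough illustration of the kind of linear program the abstract describes, the sketch below fits a hyperplane by minimizing a bound variable h subject to 1 ≤ y_i(w·x_i + b) ≤ h for every training point. This is a minimal, assumed rendering of the reduced LP (the function name `mcm_fit` and the toy data are illustrative, not from the paper), using SciPy's generic LP solver rather than any implementation the authors provide:

```python
import numpy as np
from scipy.optimize import linprog

def mcm_fit(X, y):
    """Sketch of an MCM-style linear classifier: solve the LP
    minimize h  subject to  1 <= y_i (w . x_i + b) <= h,
    a linear program in the variables (w, b, h)."""
    n, d = X.shape
    # decision vector z = [w_1..w_d, b, h]; objective: minimize h
    c = np.zeros(d + 2)
    c[-1] = 1.0
    Yx = y[:, None] * X  # rows y_i * x_i
    # y_i (w.x_i + b) >= 1  ->  -y_i x_i . w - y_i b <= -1
    A1 = np.hstack([-Yx, -y[:, None], np.zeros((n, 1))])
    b1 = -np.ones(n)
    # y_i (w.x_i + b) <= h  ->  y_i x_i . w + y_i b - h <= 0
    A2 = np.hstack([Yx, y[:, None], -np.ones((n, 1))])
    b2 = np.zeros(n)
    A_ub = np.vstack([A1, A2])
    b_ub = np.concatenate([b1, b2])
    # w and b are free; h is at least 1 by construction
    bounds = [(None, None)] * (d + 1) + [(1.0, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, b = res.x[:d], res.x[d]
    return w, b

# toy linearly separable data (hypothetical example)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = mcm_fit(X, y)
pred = np.sign(X @ w + b)
```

On separable data the constraints force every training point to the correct side of the hyperplane with margin at least 1, so `pred` should match `y`; the kernel version described in the paper would replace the dot products accordingly.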

Publisher
Database: Elsevier - ScienceDirect (ساینس دایرکت)
Journal: Neurocomputing - Volume 149, Part B, 3 February 2015, Pages 683–689
Authors