Article code: 4948572
Journal code: 1439616
Publication year: 2016
English article: 14-page PDF
Full-text version: Free download
English title of the ISI article
Building support vector machines in the context of regularized least squares
Persian translation of the title
ساخت ماشین های بردار پشتیبانی در زمینه حداقل مربعات منظم
Keywords
Data classification; Support vector machines; Regularized least squares; Fast training algorithm; Cholesky decomposition
Related subjects
Engineering and Basic Sciences; Computer Engineering; Artificial Intelligence
English abstract
This paper formulates a linear kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables for the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning the training set, the SVM weights and bias are expressed analytically using the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels whilst avoiding the need for Lagrange multipliers and duality theory. A fast iterative algorithm based on Cholesky decomposition with permutation of the support vectors is suggested as the solution method. The properties of our SVM formulation are analyzed and compared with standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived entirely in the primal context of RLS) are demonstrated using a set of public benchmark problems for both linear and nonlinear SVMs.
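The abstract describes training a linear SVM through a regularized least-squares formulation solved with Cholesky factorization. As a minimal sketch of the RLS viewpoint only, and not the paper's actual algorithm (which additionally uses error-indicator variables, a support-vector partition of the training set, and an iterative Cholesky update with permutations), the following Python/NumPy snippet fits a linear classifier by plain regularized least squares on ±1 labels; the function and variable names are illustrative assumptions.

```python
import numpy as np

def rls_linear_classifier(X, y, lam=1.0):
    """Fit a linear classifier by regularized least squares on +/-1 labels.

    Illustrative sketch only: this is ridge-regression-style RLS, not the
    paper's SVM algorithm (no indicator variables, no support-vector
    partition, no iterative permuted Cholesky updates).
    """
    n, d = X.shape
    A = np.hstack([X, np.ones((n, 1))])           # append a column for the bias term
    R = lam * np.eye(d + 1)                        # regularize the weights ...
    R[d, d] = 0.0                                  # ... but not the bias
    G = A.T @ A + R                                # symmetric positive definite system
    b = A.T @ y
    L = np.linalg.cholesky(G)                      # Cholesky factorization, G = L L^T
    z = np.linalg.solve(L, b)                      # forward substitution
    wb = np.linalg.solve(L.T, z)                   # back substitution
    return wb[:d], wb[d]                           # weight vector, bias

# Tiny usage example on synthetic, roughly separable data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.5, 1.0, (50, 2)), rng.normal(+1.5, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
w, b = rls_linear_classifier(X, y, lam=0.1)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```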
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 211, 26 October 2016, Pages 129-142
Authors