Article ID | Journal ID | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
531957 | 869890 | 2016 | 10-page PDF | Free download |
• Smoothly approximates the objective function of support vector domain description.
• Gradient-based optimization methods can be applied to the smoothed model.
• The proposed algorithms have asymptotic training complexity O(n²).
• The algorithm is easy to implement, without requiring any optimization package.
Support vector domain description (SVDD) is a well-known tool for pattern analysis when only positive examples are reliable. The SVDD model is usually fitted by solving a quadratic programming problem, which is time-consuming. This paper attempts to fit SVDD in the primal form directly. However, the primal objective function of SVDD is not differentiable, which prevents well-behaved gradient-based optimization methods from being applied. We therefore propose to approximate the primal objective function of SVDD by a differentiable function, and a conjugate gradient method is applied to minimize the smoothly approximated objective. Extensive experiments on pattern classification were conducted; compared to the quadratic programming based SVDD, the proposed approach is much more computationally efficient and yields similar classification performance on these problems.
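The idea in the abstract can be illustrated with a minimal sketch. The SVDD primal minimizes R² + C Σᵢ max(0, ‖xᵢ − a‖² − R²) over the center a and squared radius R², and the max(0, ·) term is what makes the objective non-differentiable. The sketch below smooths it with a softplus surrogate, log(1 + exp(βu))/β, and minimizes the result with SciPy's conjugate-gradient method. This is an illustration under assumed choices, not the paper's exact algorithm: the softplus smoothing, the smoothness parameter `beta`, the parameterization by the squared radius `r`, and the initialization are all assumptions made here for the sketch.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # numerically stable sigmoid

def smoothed_svdd(X, C=1.0, beta=10.0):
    """Sketch of primal SVDD fitting via smoothing (assumed setup).

    max(0, u) is replaced by the differentiable surrogate
    softplus_beta(u) = log(1 + exp(beta * u)) / beta, and the smoothed
    objective is minimized with a conjugate-gradient method.
    Returns the center `a` and squared radius `r`.
    """
    n, d = X.shape

    def unpack(theta):
        return theta[:d], theta[d]  # center a, squared radius r

    def fun(theta):
        a, r = unpack(theta)
        u = np.sum((X - a) ** 2, axis=1) - r  # slack of each point
        # r + C * sum of smoothed hinge terms (logaddexp is stable)
        return r + C * np.sum(np.logaddexp(0.0, beta * u)) / beta

    def grad(theta):
        a, r = unpack(theta)
        diff = X - a
        u = np.sum(diff ** 2, axis=1) - r
        s = expit(beta * u)  # derivative of the softplus surrogate
        g_a = -2.0 * C * (s[:, None] * diff).sum(axis=0)
        g_r = 1.0 - C * s.sum()
        return np.concatenate([g_a, [g_r]])

    # Initialize at the data mean with the mean squared distance as r.
    a0 = X.mean(axis=0)
    theta0 = np.concatenate([a0, [np.mean(np.sum((X - a0) ** 2, axis=1))]])
    res = minimize(fun, theta0, jac=grad, method="CG")
    a, r = unpack(res.x)
    return a, max(r, 0.0)
```

Because the softplus surrogate is convex and the squared distance is convex in a, the smoothed objective stays convex, so conjugate gradient converges without an off-the-shelf QP solver; each objective/gradient evaluation is a single pass over the data.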
Journal: Pattern Recognition - Volume 49, January 2016, Pages 55–64