Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
407902 | 678237 | 2013 | 13-page PDF | Free download |

We describe a new technique for sequential data analysis, called GDTW-P-SVMs. It is a maximum-margin method for constructing classifiers from variable-length input series. It combines potential support vector machines (P-SVMs) with Gaussian Dynamic Time Warping (GDTW) to remove the fixed-length restriction on feature vectors in the training and test data. As a result, GDTW-P-SVMs inherit the properties of the P-SVM method, such as the ability to (i) handle data and kernel matrices that are neither positive definite nor square and (ii) minimise a scale-invariant capacity measure. The new technique builds on the P-SVM kernel functions by using the well-known dynamic time warping algorithm to provide an elastic distance measure for the kernel functions. Classification benchmarks are performed on several real-world data sets from the UCR time series classification/clustering page, the GeoLife trajectory data set, and the UCI Machine Learning Repository; these data sets include both variable-length and fixed-length input series. The results show that the new method performs significantly better than the benchmarked standard classification methods.
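To make the kernel construction concrete, below is a minimal sketch of the GDTW idea described in the abstract: a Gaussian kernel built on a dynamic time warping distance between variable-length series, so that no fixed-length feature vectors are needed. The DTW implementation, the `gamma` parameter, and the use of scikit-learn's `SVC` with a precomputed kernel are illustrative assumptions, not the authors' implementation; the paper plugs this kernel into the P-SVM, which is not reproduced here.

```python
# Sketch of a Gaussian DTW (GDTW) kernel for variable-length series.
# Assumption: scikit-learn's SVC with a precomputed kernel stands in for the P-SVM.
import numpy as np
from sklearn.svm import SVC


def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a)*len(b)) dynamic-programming DTW distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]


def gdtw_kernel(X, Y, gamma: float = 0.1) -> np.ndarray:
    """Kernel matrix K[i, j] = exp(-gamma * DTW(X[i], Y[j])**2)."""
    K = np.empty((len(X), len(Y)))
    for i, x in enumerate(X):
        for j, y in enumerate(Y):
            K[i, j] = np.exp(-gamma * dtw_distance(x, y) ** 2)
    return K


if __name__ == "__main__":
    # Toy variable-length series: two "shapes" the classifier should separate.
    train = [np.array([0., 1., 2., 1., 0.]),
             np.array([0., 1., 2., 2., 1., 0.]),
             np.array([2., 1., 0., 1., 2.]),
             np.array([2., 1., 0., 0., 1., 2., 2.])]
    labels = [0, 0, 1, 1]

    K_train = gdtw_kernel(train, train)
    clf = SVC(kernel="precomputed").fit(K_train, labels)

    test = [np.array([0., 1., 1., 2., 1., 0., 0.])]
    K_test = gdtw_kernel(test, train)   # rows: test series, cols: training series
    print(clf.predict(K_test))          # expected: [0]
```

Note that a DTW-based Gaussian kernel matrix is in general not positive definite, which is precisely why the paper pairs it with the P-SVM (property (i) above); the standard SVM used here only serves to illustrate how the kernel is assembled and applied.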
Journal: Neurocomputing - Volume 99, 1 January 2013, Pages 270–282