Article ID: 409857
Journal ID: 679101
Publication year: 2015
English article: 11-page PDF
Full text: Free download
English Title (ISI Article)
Efficient approximations of robust soft learning vector quantization for non-vectorial data
Related Subjects
Engineering and Basic Sciences · Computer Engineering · Artificial Intelligence
Abstract (English)

Due to its intuitive learning algorithms and classification behavior, learning vector quantization (LVQ) enjoys wide popularity in diverse application domains. In recent years, the classical heuristic schemes have been accompanied by variants that can be motivated by a statistical framework, such as robust soft LVQ (RSLVQ). In their original form, LVQ and RSLVQ can be applied to vectorial data only, making them unsuitable for complex data sets described in terms of pairwise relations alone. In this contribution, we address kernel RSLVQ, which extends applicability to data described by a general Gram matrix. While leading to state-of-the-art results, this extension has the drawback that models are no longer sparse, and quadratic training complexity is encountered due to the method's dependency on the full Gram matrix. We investigate a speed-up of training by means of low-rank approximations of the Gram matrix, and we investigate how sparse models can be enforced in this context. It turns out that an efficient Nyström approximation can be used if the data are intrinsically low dimensional, a property that can be checked efficiently by sampling the variance of the approximation prior to training. Further, all models enable sparse approximations of quality comparable to that of the full models using simple geometric approximation schemes only. We demonstrate the behavior of these approximations on a number of benchmarks.
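The abstract's core speed-up idea can be illustrated with a minimal sketch of the standard Nyström technique it builds on: approximate an n×n Gram matrix from m sampled landmark columns, then gauge the approximation quality before training. This is a generic illustration, not the paper's implementation; the function name, the RBF toy kernel, and the use of the relative Frobenius error as the pre-training quality probe are all assumptions made for the example.

```python
import numpy as np

def nystroem_approx(K, landmarks):
    """Low-rank Nystrom approximation of a Gram matrix K.

    K is approximated as C @ pinv(W) @ C.T, where C holds the sampled
    landmark columns and W is the landmark-by-landmark block of K.
    """
    C = K[:, landmarks]                  # n x m sampled columns
    W = K[np.ix_(landmarks, landmarks)]  # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T

# Toy Gram matrix: RBF kernel over random 2-D points, i.e. data that are
# intrinsically low dimensional, the regime the abstract identifies as
# favorable for the Nystrom approximation.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists)

# Sample m = 40 of the n = 200 columns as landmarks.
landmarks = rng.choice(200, size=40, replace=False)
K_approx = nystroem_approx(K, landmarks)

# Cheap check in the spirit of the abstract: estimate the quality of the
# approximation before training, rather than discovering a poor fit later.
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
print(rel_err)
```

Only the m landmark columns of K are ever needed, which is what reduces the quadratic dependence on the full Gram matrix that the abstract highlights as the bottleneck of kernel RSLVQ.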

Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 147, 5 January 2015, Pages 96–106