Article code: 416016
Journal code: 681272
Publication year: 2010
English article: 19 pages, PDF
Full-text version: Free download
English title of the ISI article
Model selection with the Loss Rank Principle
Related subjects
Engineering and Basic Sciences; Computer Engineering; Computational Theory and Mathematics
English abstract

A key issue in statistics and machine learning is to automatically select the “right” model complexity, e.g., the number of neighbors to be averaged over in k nearest neighbor (kNN) regression or the polynomial degree in regression with polynomials. We suggest a novel principle, the Loss Rank Principle (LoRP), for model selection in regression and classification. It is based on the loss rank, which counts how many other (fictitious) data would be fitted better. LoRP selects the model that has minimal loss rank. Unlike most penalized maximum likelihood variants (AIC, BIC, MDL), LoRP depends only on the regression functions and the loss function. It works without a stochastic noise model, and is directly applicable to any non-parametric regressor, like kNN.
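
To make the abstract's criterion concrete, below is a minimal sketch of loss-rank model selection for kNN regression, treating kNN as a linear smoother y_hat = M y. It uses a log-volume proxy for the loss rank of the rough form n/2 * log(RSS) - 1/2 * log det((I - M)'(I - M)); the paper's exact constants and regularization scheme differ, and the helper names (knn_smoother_matrix, loss_rank) and the ridge term eps are assumptions made here for illustration, not the authors' code.

```python
# Illustrative sketch only: kNN regression is a linear smoother y_hat = M y,
# so the set {y' : Loss(y') <= Loss(y)} is an ellipsoid whose log-volume gives
# (up to model-independent constants) a criterion of roughly the form
#   n/2 * log(RSS) - 1/2 * log det(S),  with  S = (I - M)' (I - M).
# The small ridge term `eps` is a hypothetical regularizer added here to keep
# S invertible (e.g., for k = 1, where M reproduces y exactly).
import numpy as np

def knn_smoother_matrix(x, k):
    """Matrix M with M[i, j] = 1/k if x_j is one of the k nearest
    neighbours of x_i (including x_i itself), else 0."""
    n = len(x)
    M = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(np.abs(x - x[i]))[:k]   # k nearest by |x_i - x_j|
        M[i, nbrs] = 1.0 / k
    return M

def loss_rank(x, y, k, eps=1e-6):
    """Log-volume proxy for the loss rank of kNN regression with k neighbours."""
    n = len(y)
    M = knn_smoother_matrix(x, k)
    R = np.eye(n) - M
    S = R.T @ R + eps * np.eye(n)            # regularized (I - M)'(I - M)
    rss = float(y @ S @ y)                   # quadratic loss (plus tiny ridge term)
    sign, logdet = np.linalg.slogdet(S)
    return 0.5 * n * np.log(rss) - 0.5 * logdet

# LoRP-style model selection: pick the k with the smallest loss rank.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 60))
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=60)
best_k = min(range(1, 21), key=lambda k: loss_rank(x, y, k))
print("selected k:", best_k)
```

On this synthetic sine data the snippet picks an intermediate k; the point is only to illustrate how "minimal loss rank" plays the role that a penalized likelihood plays in AIC/BIC, without requiring a noise model.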

Publisher
Database: Elsevier - ScienceDirect
Journal: Computational Statistics & Data Analysis - Volume 54, Issue 5, 1 May 2010, Pages 1288–1306
Authors