Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
6870487 | 681394 | 2014 | 13-page PDF | Free download |
English Title of the ISI Article
Hypercube estimators: Penalized least squares, submodel selection, and numerical stability
Related Subjects
Engineering and Basic Sciences
Computer Engineering
Computational Theory and Mathematics

English Abstract
Hypercube estimators for the mean vector in a general linear model include algebraic equivalents to penalized least squares estimators with quadratic penalties and to submodel least squares estimators. Penalized least squares estimators necessarily break down numerically for certain penalty matrices. Equivalent hypercube estimators resist this source of numerical instability. Under conditions, adaptation over a class of candidate hypercube estimators, so as to minimize the estimated quadratic risk, also minimizes the asymptotic risk under the general linear model. Numerical stability of hypercube estimators assists trustworthy adaptation. Hypercube estimators have broad applicability to any statistical methodology that involves penalized least squares. Notably, they extend to general designs the risk reduction achieved by Stein's multiple shrinkage estimators for balanced observations on an array of means.
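The contrast the abstract draws, between penalized least squares computed through the normal equations and an algebraically equivalent estimator built from bounded shrinkage factors, can be sketched numerically. The Python snippet below is only an illustration under simplifying assumptions, not the paper's general hypercube construction: the quadratic penalty matrix is taken to be diagonal in the right-singular-vector basis of the design, and all variable names are made up for the example.

```python
# Minimal sketch: penalized least squares via the normal equations versus an
# equivalent fit expressed through shrinkage factors in [0, 1]. The penalty is
# assumed diagonal in the singular-vector basis of X; this is an illustrative
# simplification, not the construction used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n, p = 60, 6
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

U, s, Vt = np.linalg.svd(X, full_matrices=False)    # X = U diag(s) Vt
w = np.array([1e18, 1e-3, 1e-3, 1e-3, 1e-3, 1e-3])  # one penalty weight per direction
W = Vt.T @ np.diag(w) @ Vt                          # quadratic penalty matrix

# Route 1: direct penalized least squares, solving (X'X + W) b = X'y.
G = X.T @ X + W
print("cond(X'X + W) =", np.linalg.cond(G))         # extreme eigenvalue spread
beta_pls = np.linalg.solve(G, X.T @ y)              # accuracy can degrade here
fit_pls = X @ beta_pls

# Route 2: algebraically equivalent fitted values via shrinkage factors that
# always lie in [0, 1], so no ill-conditioned system is ever solved.
d = s**2 / (s**2 + w)
fit_shrink = U @ (d * (U.T @ y))

print("max |fit difference|:", np.max(np.abs(fit_pls - fit_shrink)))
```

Because the second route touches the penalty only through the bounded factors s_i^2 / (s_i^2 + w_i), it stays numerically stable even when the penalty weights make X'X + W nearly singular, which is the kind of robustness the abstract attributes to hypercube estimators.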
Publisher
Database: Elsevier - ScienceDirect
Journal: Computational Statistics & Data Analysis - Volume 71, March 2014, Pages 654-666
Authors
Rudolf Beran