Article code: 1149654
Journal code: 957891
Publication year: 2012
English article: 13-page PDF
Full-text version: Free download
English title of the ISI article
Regularized least-squares regression: Learning from a β-mixing sequence
Related topics
Engineering and Basic Sciences, Mathematics, Applied Mathematics
English abstract

We analyze the rate of convergence of the estimation error in regularized least-squares regression when the data is exponentially β-mixing. The results are proven under the assumption that the metric entropy of the balls in the chosen function space grows at most polynomially. In order to prove our main result, we also derive a relative deviation concentration inequality for β-mixing processes, which might be of independent interest. The other major techniques that we use are the independent-blocks technique and the peeling device. An interesting aspect of our analysis is that in order to obtain fast rates we have to make the block sizes dependent on the layer of peeling. With this approach, up to a logarithmic factor, we recover the optimal minimax rates available for the i.i.d. case. In particular, our rate asymptotically matches the optimal rate of convergence when the regression function belongs to a Sobolev space.
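
A minimal sketch of the regularized least-squares estimator the abstract studies, stated here as kernel ridge regression over the RKHS of a Gaussian kernel. The kernel choice, bandwidth and regularization parameter lam are illustrative assumptions, not values taken from the paper; the synthetic sample below is i.i.d. for simplicity, whereas the paper's analysis concerns an exponentially β-mixing sequence.

```python
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gram matrix of the Gaussian (RBF) kernel between rows of A and rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def fit_regularized_ls(X, y, lam=0.1, bandwidth=1.0):
    """Regularized least squares: minimize (1/n) * sum_i (f(X_i) - y_i)^2 + lam * ||f||_H^2
    over the Gaussian-kernel RKHS; returns the dual coefficients alpha."""
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return alpha

def predict(X_train, alpha, X_new, bandwidth=1.0):
    """Evaluate the fitted function f(x) = sum_i alpha_i * k(x, X_i)."""
    return gaussian_kernel(X_new, X_train, bandwidth) @ alpha

# Usage on a synthetic (i.i.d.) regression sample; parameter values are arbitrary.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
alpha = fit_regularized_ls(X, y, lam=0.01, bandwidth=0.7)
y_hat = predict(X, alpha, X, bandwidth=0.7)
```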

Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Statistical Planning and Inference - Volume 142, Issue 2, February 2012, Pages 493–505
Authors