Article ID | Journal ID | Year | English Article | Full Text |
---|---|---|---|---|
6865947 | 679603 | 2015 | 11-page PDF | Free Download |
English Title of the ISI Article
Online training and its convergence for faulty networks with multiplicative weight noise
Related Topics
Engineering and Basic Sciences
Computer Engineering
Artificial Intelligence

English Abstract
A recent article showed that the objective function of the online weight noise injection algorithm is not equal to the training set error of faulty radial basis function (RBF) networks under the weight noise situation (Ho et al., 2010). Hence the online weight noise injection algorithm is not able to optimize the training set error of faulty networks with multiplicative weight noise. This paper proposes an online learning algorithm to tolerate multiplicative weight noise. Two learning rate cases, namely fixed learning rate and adaptive learning rate, are investigated. For the fixed learning rate case, we show that if the learning rate μ is less than 2/(σ²_b + max_i ‖φ(x_i)‖²), then the online algorithm converges, where the x_i's are the training input vectors, σ²_b is the variance of the multiplicative weight noise, φ(x_i) = [φ_1(x_i), …, φ_M(x_i)]ᵀ, and φ_j(·) is the output of the j-th RBF node. In addition, as μ → 0, the trained weight vector tends to the optimal solution. For the adaptive learning rate case, let the learning rates {μ_k} be a decreasing sequence with lim_{k→∞} μ_k = 0, where k is the index of learning cycles. We prove that if ∑_{k=1}^∞ μ_k = ∞ and ∑_{k=1}^∞ μ_k² < ∞, then the weight vector converges to the optimal solution. Our simulation results show that the performance of the proposed algorithm is better than that of the conventional online approaches, such as the online weight decay and weight noise injection.
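The abstract's two learning rate conditions can be illustrated numerically. The sketch below, which assumes Gaussian RBF nodes and toy data (the paper may use a different basis and parameters), computes the fixed learning rate bound 2/(σ²_b + max_i ‖φ(x_i)‖²) and a schedule μ_k = c/k that satisfies the adaptive-rate conditions ∑ μ_k = ∞ and ∑ μ_k² < ∞; the names `rbf_outputs`, `fixed_lr_bound`, and `adaptive_lr_schedule` are illustrative, not from the paper.

```python
import numpy as np

def rbf_outputs(X, centers, width):
    # Gaussian RBF node outputs: phi_j(x) = exp(-||x - c_j||^2 / width)
    # (an illustrative basis; the paper's exact node definition may differ)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / width)

def fixed_lr_bound(X, centers, width, sigma_b2):
    # Bound from the abstract: mu < 2 / (sigma_b^2 + max_i ||phi(x_i)||^2)
    Phi = rbf_outputs(X, centers, width)      # shape (N, M)
    max_norm2 = (Phi ** 2).sum(axis=1).max()  # max_i ||phi(x_i)||^2
    return 2.0 / (sigma_b2 + max_norm2)

def adaptive_lr_schedule(k, c=1.0):
    # mu_k = c/k is decreasing with sum(mu_k) = inf and sum(mu_k^2) < inf,
    # so it meets the abstract's adaptive learning rate conditions.
    return c / k

# Toy setup: 1-D inputs, 3 RBF centers (all values are illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
centers = np.array([[-0.5], [0.0], [0.5]])
bound = fixed_lr_bound(X, centers, width=0.5, sigma_b2=0.01)
print(f"fixed learning rate must satisfy mu < {bound:.4f}")
```

Any fixed μ strictly below the printed bound keeps the online update convergent per the abstract's result; the harmonic schedule is one standard choice meeting the summability conditions.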
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 155, 1 May 2015, Pages 53-61
Authors
Zifa Han, Rui-Bin Feng, Wai Yan Wan, Chi-Sing Leung