Article code: 6865235 · Journal code: 1439555 · Publication year: 2018 · English article, full text: 8-page PDF, free download
English title of the ISI article
L1/2 regularization learning for smoothing interval neural networks: Algorithms and convergence analysis
Related subjects: Engineering and Basic Sciences › Computer Engineering › Artificial Intelligence
English abstract
Interval neural networks can easily address uncertain information, since they inherently handle various kinds of uncertainty represented by intervals. Lq (0 < q < 1) regularization was proposed after L1 regularization to better solve sparsity problems; among these, L1/2 is of particular importance and can be taken as a representative. However, weight oscillation may occur during the learning process because the derivative of the L1/2 regularizer is discontinuous. In this paper, a novel batch gradient algorithm with smoothing L1/2 regularization is proposed to prevent weight oscillation for a smoothing interval neural network (SINN), a modified interval neural network. Here, by smoothing we mean that, in a neighborhood of the origin, the absolute values of the weights are replaced by a smooth function so that the derivative is continuous. Compared with the conventional gradient learning algorithm with L1/2 regularization, this approach obtains sparser weights and a simpler structure, and improves learning efficiency. We then present a sufficient condition for the convergence of SINN. Finally, simulation results illustrate the main convergence results.
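The core idea in the abstract, replacing |w| near the origin with a smooth surrogate so the L1/2 penalty has a continuous derivative, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the quadratic surrogate, the smoothing radius `a`, and the gradient-step helper are assumed choices for exposition (the paper's polynomial may differ).

```python
import numpy as np

def smooth_abs(w, a=0.1):
    # Smooth surrogate for |w|: quadratic on (-a, a), exact |w| outside.
    # Chosen so value and derivative match |w| at w = +/- a (an assumed,
    # common construction; the paper's smoothing function may differ).
    return np.where(np.abs(w) >= a, np.abs(w), w**2 / (2 * a) + a / 2)

def smooth_abs_grad(w, a=0.1):
    # Derivative of the surrogate: sign(w) outside, w/a inside -> continuous.
    return np.where(np.abs(w) >= a, np.sign(w), w / a)

def l12_penalty(w, a=0.1):
    # Smoothed L1/2 regularizer: sum_i f(w_i)^(1/2), with f the surrogate.
    return np.sum(smooth_abs(w, a) ** 0.5)

def l12_penalty_grad(w, a=0.1):
    # d/dw f(w)^(1/2) = f'(w) / (2 sqrt(f(w))); finite at w = 0 because
    # f(0) = a/2 > 0, which is what removes the oscillation-causing blow-up.
    f = smooth_abs(w, a)
    return 0.5 * f ** (-0.5) * smooth_abs_grad(w, a)

def batch_gradient_step(w, data_grad, lam=0.01, eta=0.1, a=0.1):
    # One batch gradient step: data-fit gradient plus smoothed penalty gradient.
    # (Hypothetical helper for illustration; `data_grad` would come from the
    # network's error function.)
    return w - eta * (data_grad + lam * l12_penalty_grad(w, a))
```

Note the design point: with the raw penalty |w|^(1/2), the gradient diverges as w approaches 0, so weights near zero jump back and forth; the surrogate keeps the penalty gradient bounded (it is exactly 0 at w = 0 here), which is the mechanism the abstract credits for preventing weight oscillation.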
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 272, 10 January 2018, Pages 122-129
Authors