Article code: 6856132 | Journal code: 1437946 | Year: 2018 | Full text: 37-page PDF
English Title
Randomized mixture models for probability density approximation and estimation
Persian Title
مدل های مخلوط تصادفی برای تقریب چگالی احتمال و برآورد
Keywords
Density estimation; Expectation-maximization algorithm; Functional approximation; Mixture models; Neural networks; Random vector functional-link networks
Related Subjects
Engineering and Basic Sciences; Computer Engineering; Artificial Intelligence
English Abstract
Neural networks (NNs) with random weights are an interesting alternative to conventional NNs that are more commonly used for data modeling. The random vector functional-link (RVFL) network is an established and theoretically well-grounded randomized learner. A key theoretical result for RVFL networks is that they provide universal approximation for continuous maps, in expectation, with respect to the square-integral norm. We specialize and modify this result, and show that RVFL networks can provide functional approximations that converge in Kullback-Leibler divergence, when the target function is a probability density function. Expanding on the approximation results, we demonstrate that RVFL networks lead to a simple randomized mixture model (MM) construction for density estimation from sample data. An expectation-maximization (EM) algorithm is derived for the maximum likelihood estimation of our randomized MM. The EM algorithm is proved to be globally convergent and the maximum likelihood estimator is proved to be consistent. A set of simulation studies is given to provide empirical evidence towards our approximation and density estimation results.
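The randomized-MM idea described in the abstract can be sketched roughly as follows. This is a hypothetical illustration, not the paper's implementation: Gaussian components with randomly drawn centres and scales stand in for the paper's randomized components, and EM updates only the mixing weights, since the random component parameters are drawn once and held fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_random_mixture(x, K=20, n_iter=200):
    """Fit mixing weights of a mixture whose component parameters are random
    and fixed (a stand-in for the paper's randomized mixture model)."""
    # Draw random component centres/scales once; they are never re-estimated.
    mu = rng.uniform(x.min(), x.max(), size=K)
    sigma = rng.uniform(0.1, 1.0, size=K)
    # phi[i, k] = Normal density of x_i under component k.
    phi = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample.
        w = phi * pi
        w /= w.sum(axis=1, keepdims=True)
        # M-step: only the mixing weights are updated.
        pi = w.mean(axis=0)
    return mu, sigma, pi

def density(xq, mu, sigma, pi):
    """Evaluate the fitted mixture density at query points xq."""
    phi = np.exp(-0.5 * ((xq[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return phi @ pi

# Toy usage: estimate a standard normal density from samples.
x = rng.normal(0.0, 1.0, size=2000)
mu, sigma, pi = fit_random_mixture(x)
print(density(np.array([0.0]), mu, sigma, pi))  # should be near the N(0,1) density at 0
```

Because the component densities are fixed, each EM iteration is a simple reweighting; this is what makes the randomized construction cheap relative to fitting all mixture parameters.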
Publisher
Database: Elsevier - ScienceDirect
Journal: Information Sciences - Volume 467, October 2018, Pages 135-148
Authors