Article code: 11008001
Journal code: 1840489
Publication year: 2018
Full text: 39-page PDF (free download)
English title of the ISI article
DMP-ELMs: Data and model parallel extreme learning machines for large-scale learning tasks
Related subjects
Engineering and Basic Sciences > Computer Engineering > Artificial Intelligence
English abstract
As machine learning applications embrace larger data sizes and model complexity, practitioners turn to distributed clusters to satisfy the increasing computational and memory demands. Recently, several parallel variants of the extreme learning machine (ELM) have been proposed, some of which run on clusters. However, these variants still do not adequately address the computational and memory limitations that arise when both the data and the model are very large. Our goal is to build scalable ELMs with large numbers of samples and hidden neurons that run in parallel on clusters without computational or memory bottlenecks, while producing the same output as the sequential ELM. In this paper, we propose two parallel variants of ELM, referred to as local data and model parallel ELM (LDMP-ELM) and global data and model parallel ELM (GDMP-ELM). Both variants are implemented on clusters with a Message Passing Interface (MPI) environment. Each makes a different tradeoff between efficiency and scalability, and the two have complementary advantages. Collectively, these two variants are called data and model parallel ELMs (DMP-ELMs). The advantages of DMP-ELMs over existing variants are as follows: (1) They simultaneously apply data and model parallel techniques to improve the parallelism of ELM. (2) They scale to larger data and models because they address the memory and computational bottlenecks that appear in existing variants. Extensive experiments conducted on four large-scale datasets show that the proposed algorithms scale well and achieve almost ideal speedup. To the best of our knowledge, this is the first time a large ELM model with 50,000 hidden neurons has been successfully trained on the mnist8m dataset, which has 8.1 million samples and 784 features.
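To make the data-parallel side of this idea concrete, the following is a minimal sketch (not the authors' LDMP-ELM or GDMP-ELM implementation) of training an ELM over MPI with mpi4py: each rank holds a shard of the samples, computes partial Gram matrices H^T H and H^T T for its shard, and an Allreduce sums them so every rank can solve for the output weights. The function and parameter names (n_hidden, regularization C, seed) are illustrative assumptions; the paper's model-parallel component, which additionally partitions the hidden neurons across ranks to handle very large models such as 50,000 neurons, is not shown here.

```python
# Illustrative data-parallel ELM training with MPI (assumed names, not the paper's code).
import numpy as np
from mpi4py import MPI


def train_elm_data_parallel(X_local, T_local, n_hidden=1000, C=1.0, seed=0):
    """Train an ELM whose training samples are sharded across MPI ranks."""
    comm = MPI.COMM_WORLD
    n_features = X_local.shape[1]

    # Every rank draws the same random input weights and biases (shared seed),
    # so the hidden-layer mapping is identical across the cluster.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)

    # Hidden-layer output for this rank's shard of samples.
    H = np.tanh(X_local @ W + b)

    # Partial sums of H^T H and H^T T over the local samples.
    HtH_local = H.T @ H
    HtT_local = H.T @ T_local

    # Sum the partial matrices across all ranks so each rank sees the global sums.
    HtH = np.empty_like(HtH_local)
    HtT = np.empty_like(HtT_local)
    comm.Allreduce(HtH_local, HtH, op=MPI.SUM)
    comm.Allreduce(HtT_local, HtT, op=MPI.SUM)

    # Regularized least-squares solution for the output weights beta,
    # identical to what a sequential ELM would compute on the full data.
    beta = np.linalg.solve(HtH + np.eye(n_hidden) / C, HtT)
    return W, b, beta
```

Because only the n_hidden-by-n_hidden reduced matrices are communicated, the per-rank memory cost is independent of the number of samples; the remaining bottleneck is the size of H^T H itself, which is what motivates also partitioning the model across ranks for very large hidden layers.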
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 320, 3 December 2018, Pages 85-97