Article code: 4970089
Journal code: 1450026
Publication year: 2017
English paper: 10-page PDF
English title (ISI article)
Winner takes all hashing for speeding up the training of neural networks in large class problems
Related topics
Engineering and Basic Sciences / Computer Engineering / Computer Vision and Pattern Recognition
English abstract
This paper proposes speeding up convolutional neural networks using Winner Takes All (WTA) hashing. More specifically, the WTA hash is used to identify the relevant units, and only these are computed, effectively ignoring the rest. We show that the proposed method reduces the computational cost of forward and backward propagation for large fully connected layers. This allows us to train classification layers with a large number of units without the associated time penalty. We present several experiments on a dataset with 21K classes to gauge the effectiveness of this proposal, showing concretely that only a small amount of computation is required to train a classification layer. We then measure and showcase the ability of WTA hashing to identify the required computation. Finally, we compare this approach to the baseline and demonstrate a 6-fold speed-up during training without compromising performance.
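For context, a WTA hash produces a compact ordinal code by repeatedly permuting a vector and recording which of the first K permuted entries is largest; because the code depends only on the relative order of values, it is a cheap proxy for similarity and can be used to pick out relevant units. Below is a minimal sketch of the hash computation itself, following the standard WTA hashing scheme rather than the authors' implementation; the function name, parameters, and NumPy-based setup are illustrative assumptions.

```python
# Minimal sketch of Winner-Takes-All (WTA) hashing (illustrative, not the
# paper's code): m permutations, each contributing one small integer to
# the hash code of a vector.
import numpy as np

def wta_hash(x, perms, k):
    """Compute a WTA hash code for vector x.

    perms: int array of shape (m, len(x)); each row is a permutation of indices.
    k:     window size; only the first k permuted elements are compared.
    Returns an array of m indices in [0, k), one per permutation.
    """
    # For each permutation, look at the first k permuted entries of x and
    # record the position of the maximum (the "winner").
    return np.array([np.argmax(x[p[:k]]) for p in perms])

# Usage: hash a 256-dimensional activation vector with m=16 permutations, k=4.
rng = np.random.default_rng(0)
d, m, k = 256, 16, 4
perms = np.array([rng.permutation(d) for _ in range(m)])
x = rng.standard_normal(d)
code = wta_hash(x, perms, k)  # array of 16 integers in [0, 4)
```

Since only orderings matter, the code is invariant to any monotonic rescaling of the input, which is what makes matching hash codes a plausible stand-in for the expensive dot products a full fully connected layer would compute.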
Publisher
Database: Elsevier - ScienceDirect
Journal: Pattern Recognition Letters - Volume 93, 1 July 2017, Pages 38-47
Authors