Article code: 4946655
Journal code: 1439409
Publication year: 2017
English article: 8-page PDF
Full-text version: free download
English title of the ISI article
Recurrent networks with soft-thresholding nonlinearities for lightweight coding
Persian translation of title
شبکه‌های بازگشتی با غیرخطی‌های آستانه‌گذاری نرم برای کدگذاری سبک
Keywords
Efficient lightweight coding, unsupervised learning, short-term memory, proximal gradient descent, neural networks
Related subjects
Engineering and Basic Sciences > Computer Engineering > Artificial Intelligence
English abstract
A long-standing and influential hypothesis in neural information processing is that early sensory networks adapt themselves to produce efficient codes of afferent inputs. Here, we show how a nonlinear recurrent network provides an optimal solution for the efficient coding of an afferent input and its history. We specifically consider the problem of producing lightweight codes, ones that minimize both ℓ1 and ℓ2 constraints on sparsity and energy, respectively. When embedded in a linear coding paradigm, this problem results in a non-smooth convex optimization problem. We employ a proximal gradient descent technique to develop the solution, showing that the optimal code is realized through a recurrent network endowed with a nonlinear soft-thresholding operator. The training of the network connection weights is readily achieved through gradient-based local learning. If such learning is assumed to occur on a slower time-scale than the (faster) recurrent dynamics, then the network as a whole converges to an optimal set of codes and weights via what is, in effect, an alternating minimization procedure. Our results show how the addition of thresholding nonlinearities to a recurrent network may enable the production of lightweight, history-sensitive encoding schemes.
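The pipeline the abstract describes (a non-smooth convex coding objective, solved by proximal gradient descent whose iterations look like recurrent dynamics with a soft-thresholding nonlinearity, plus slower local weight learning) can be illustrated with a short sketch. The code below is a minimal illustration under assumed details, not the paper's implementation: the elastic-net-style objective, the names (soft_threshold, encode, learn, Phi, lam, gam), and all parameter values are hypothetical choices for exposition.

```python
import numpy as np

# Hypothetical sketch of lightweight coding, assuming the objective
#   min_a  0.5*||x - Phi @ a||^2 + lam*||a||_1 + 0.5*gam*||a||^2
# (an l1 sparsity cost plus an l2 energy cost). Proximal gradient descent
# on this cost gives an iteration with a soft-thresholding nonlinearity;
# slow, local gradient steps on Phi play the role of weight learning.

def soft_threshold(v, thresh):
    """Proximal operator of the l1 norm: shrink v toward zero by thresh."""
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

def encode(x, Phi, lam=0.1, gam=0.05, n_steps=200, eta=None):
    """Fast recurrent dynamics: proximal gradient (ISTA-style) coding of x."""
    if eta is None:
        # step size at most 1/L, with L the Lipschitz constant of the smooth part
        eta = 1.0 / (np.linalg.norm(Phi, 2) ** 2 + gam)
    a = np.zeros(Phi.shape[1])
    for _ in range(n_steps):
        grad = Phi.T @ (Phi @ a - x) + gam * a   # gradient of the smooth terms
        a = soft_threshold(a - eta * grad, eta * lam)
    return a

def learn(X, n_atoms, lam=0.1, gam=0.05, lr=0.01, n_epochs=20, seed=0):
    """Alternating minimization: fast codes (inner loop), slow weight updates."""
    rng = np.random.default_rng(seed)
    Phi = rng.standard_normal((X.shape[1], n_atoms))
    Phi /= np.linalg.norm(Phi, axis=0, keepdims=True)
    for _ in range(n_epochs):
        for x in X:                               # X: (n_samples, n_dim)
            a = encode(x, Phi, lam, gam)          # fast time-scale
            residual = x - Phi @ a
            Phi += lr * np.outer(residual, a)     # slow, local (Hebbian-like) step
            Phi /= np.maximum(np.linalg.norm(Phi, axis=0, keepdims=True), 1e-12)
    return Phi
```

Note that the inner update can be rewritten as a ← soft_threshold((I − η(ΦᵀΦ + γI)) a + ηΦᵀx, ηλ), i.e., a network with fixed feedforward drive and recurrent lateral weights, which is the sense in which the optimal code is realized by recurrent dynamics with a soft-thresholding operator; the outer loop, run on a slower time-scale, corresponds to the alternating minimization mentioned in the abstract.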
Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 94, October 2017, Pages 212-219
Authors