Article code: 6260436
Journal code: 1613079
Publication year: 2016
Full text: 6-page PDF
English title of the ISI article
Minimum description length model selection in associative learning
Related topics
Life Sciences and Biotechnology · Neuroscience · Behavioral Neuroscience
English abstract


- A non-associative model of associative learning.
- Focus on the encoding of intervals into memory.
- Partly inspired by results showing interval memory in Purkinje cells.
- Founded on minimum description length principle for stochastic model selection.
- Explains cue competition, response timing and the parametric invariances.

Two principles of information theory - maximum entropy and minimum description length - motivate a computational model of associative learning that explains assignment of credit, response timing, and the parametric invariances. The maximum entropy principle gives rise to two distributions - the exponential and the Gaussian - which naturally lend themselves to inference involving each of the two fundamental classes of predictors - enduring states and point events. These distributions are the 'atoms' from which more complex representations are built. The representation that is 'learned' is determined by the principle of minimum description length. In this theory, learning is a synonym for data compression: the representation of its experience that the animal learns is the representation that best allows the data of experience to be compressed.
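The model-selection idea in the abstract - score each candidate representation by how compactly it encodes the data, i.e. the code length of the data under the model plus a cost for the model's parameters - can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names are invented, and the BIC-style (k/2)·log2(n) parameter cost is one standard two-part MDL approximation. It compares the two 'atomic' distributions named in the abstract on simulated memoryless interval data.

```python
import math
import random

def mdl_score(neg_log_lik_nats, k, n):
    # Two-part MDL code length in bits: data cost under the fitted model
    # plus a (k/2) * log2(n) penalty for its k parameters (BIC-style).
    return neg_log_lik_nats / math.log(2) + 0.5 * k * math.log2(n)

def nll_exponential(data):
    # Negative log-likelihood at the ML fit (rate = 1 / sample mean).
    mean = sum(data) / len(data)
    return len(data) * (1.0 + math.log(mean))

def nll_gaussian(data):
    # Negative log-likelihood at the ML fit (sample mean and variance).
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    return 0.5 * n * (math.log(2 * math.pi * var) + 1.0)

random.seed(0)
# Simulated inter-event intervals from a memoryless (Poisson) process.
intervals = [random.expovariate(0.5) for _ in range(200)]

n = len(intervals)
scores = {
    "exponential": mdl_score(nll_exponential(intervals), k=1, n=n),
    "gaussian": mdl_score(nll_gaussian(intervals), k=2, n=n),
}
best = min(scores, key=scores.get)
print(best)  # exponentially distributed intervals compress better
             # under the exponential model
```

On data generated by a memoryless process the exponential model yields the shorter total description, so MDL selects it; swapping in Gaussian-distributed intervals would flip the choice, which is the sense in which learning-as-compression picks the representation.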

Publisher
Database: Elsevier - ScienceDirect
Journal: Current Opinion in Behavioral Sciences - Volume 11, October 2016, Pages 8-13