Article code: 6027021
Journal code: 1580908
Publication year: 2014
English article: 7-page PDF
Full text: free download
English title of the ISI article
Efficient gradient computation for dynamical models
Persian translation of the title
محاسبات شیب کارآمد برای مدل های دینامیکی
Keywords
Related subjects
Life Sciences and Biotechnology; Neuroscience; Cognitive Neuroscience
English abstract
Data assimilation is a fundamental issue that arises across many scales in neuroscience, ranging from the study of single neurons using single-electrode recordings to the interaction of thousands of neurons using fMRI. Data assimilation involves inverting a generative model that can not only explain observed data but also generate predictions. Typically, the model is inverted or fitted using conventional tools of (convex) optimisation that invariably extremise some functional: norms, minimum description length, variational free energy, etc. Generally, optimisation rests on evaluating the local gradients of the functional to be optimised. In this paper, we compare three gradient estimation techniques that could be used for extremising any functional in time: (i) finite differences, (ii) forward sensitivities, and (iii) a method based on the adjoint of the dynamical system. We demonstrate that the first-order gradients of a dynamical system, linear or non-linear, can be computed most efficiently using the adjoint method. This is particularly true for systems where the number of parameters is greater than the number of states. For such systems, integrating several sensitivity equations, as required by forward sensitivities, proves to be the most expensive, while finite-difference approximations have intermediate efficiency. In the context of neuroimaging, adjoint-based inversion of dynamic causal models (DCMs) can, in principle, enable the study of models with large numbers of nodes and parameters.
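As a point of reference, the following is a standard sketch of the continuous-time formulations being compared; the symbols (state \(x \in \mathbb{R}^n\), parameters \(\theta \in \mathbb{R}^p\), dynamics \(f\), integrand \(g\), adjoint state \(\lambda\)) are generic placeholders, and the sketch assumes an integral objective with a fixed initial condition rather than the paper's specific variational objective.
\[
\dot{x} = f(x,\theta), \qquad J(\theta) = \int_0^T g(x,\theta)\,dt
\]
Forward sensitivities propagate \(S = \partial x / \partial \theta\) alongside the states, an \(n \times p\) matrix ODE:
\[
\dot{S} = \frac{\partial f}{\partial x}\,S + \frac{\partial f}{\partial \theta}, \qquad S(0) = 0, \qquad
\frac{dJ}{d\theta} = \int_0^T \left( \frac{\partial g}{\partial x}\,S + \frac{\partial g}{\partial \theta} \right) dt
\]
The adjoint method instead integrates a single \(n\)-dimensional ODE backward in time:
\[
\dot{\lambda} = -\left( \frac{\partial f}{\partial x} \right)^{\!\top} \lambda - \left( \frac{\partial g}{\partial x} \right)^{\!\top}, \qquad \lambda(T) = 0, \qquad
\frac{dJ}{d\theta} = \int_0^T \left( \lambda^{\top} \frac{\partial f}{\partial \theta} + \frac{\partial g}{\partial \theta} \right) dt
\]
Under these assumptions, finite differences need \(p+1\) forward integrations (\(2p\) for central differences), forward sensitivities add \(n \cdot p\) equations, and the adjoint adds only \(n\) equations plus storage of the forward trajectory, which is why the adjoint scales best when the number of parameters exceeds the number of states.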
Publisher
Database: Elsevier - ScienceDirect
Journal: NeuroImage - Volume 98, September 2014, Pages 521-527
Authors