Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
4335626 | Journal of Neuroscience Methods | 2010 | 10 Pages | |
Abstract
Spatial normalization to a common coordinate space, e.g. via the Montreal Neurological Institute (MNI) brain template, is an essential step in analyzing multi-subject functional MRI (fMRI) datasets. Imperfect compensation for individual regional discrepancies during spatial transformation, which can introduce localization errors of the activation foci and/or reduce detection sensitivity, may be minimized if a template specifically designed for the subjects of a study is applied. In this fMRI study, we proposed and evaluated the use of a study-specific template (SST), based on the mean of individually normalized echo-planar images, for group data analysis. In experiment 1, young volunteers performed a hand flexion task and a word generation task. Compared with the MNI template approach, the SST approach yielded greater t-values of local maxima and more activated voxels within volumes of interest (VOIs) in both tasks. Moreover, the SST approach reduced the Euclidean distances between individual and group activation foci by 1.52 mm in motor fMRI and 5.84 mm in language fMRI. Similar results were obtained with or without spatial smoothing of the echo-planar images. Experiment 2 further examined the two approaches in older adults, in whom volumetric differences between subjects are of great concern. With a working memory task, the SST approach showed greater t-values of local maxima and more activated voxels within the VOI of the prefrontal gyrus. This study demonstrated that the SST produced more focused activation patterns and effectively improved fMRI sensitivity, suggesting the potential to reduce the number of subjects required for group analysis.
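The two quantitative ingredients the abstract describes are straightforward: the SST is the voxel-wise mean of the individually normalized echo-planar volumes, and the foci comparison is a Euclidean distance between (x, y, z) coordinates in millimeters. A minimal NumPy sketch under those assumptions (array shapes and function names are illustrative, not from the paper):

```python
import numpy as np

def build_sst(normalized_epis):
    """Study-specific template as the voxel-wise mean of EPI volumes.

    normalized_epis: list of 3-D arrays that have already been spatially
    normalized into the same coordinate space (e.g. MNI-like), so a
    simple average across subjects is meaningful.
    """
    stack = np.stack(normalized_epis, axis=0)  # shape: (n_subjects, x, y, z)
    return stack.mean(axis=0)

def focus_distance(focus_a, focus_b):
    """Euclidean distance (mm) between two activation foci given as
    (x, y, z) coordinates in template space."""
    return float(np.linalg.norm(np.asarray(focus_a, dtype=float)
                                - np.asarray(focus_b, dtype=float)))
```

For example, a subject focus at (30, -22, 56) and a group focus at (33, -18, 56) are 5 mm apart; distances like these, averaged over subjects, are what the abstract reports shrinking under the SST approach.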
Related Topics
Life Sciences
Neuroscience
Neuroscience (General)
Authors
Chih-Mao Huang, Shwu-Hua Lee, Ing-Tsung Hsiao, Wan-Chun Kuan, Yau-Yau Wai, Han-Jung Ko, Yung-Liang Wan, Yuan-Yu Hsu, Ho-Ling Liu