Article ID: 920191
Journal: Acta Psychologica
Published Year: 2011
Pages: 12
File Type: PDF
Abstract

Even though human perceptual development relies on combining multiple modalities, most categorization studies so far have focused on the visual modality. To better understand the mechanisms underlying multisensory categorization, we analyzed visual and haptic perceptual spaces and compared them with human categorization behavior. As stimuli we used a three-dimensional object space of complex, parametrically defined objects. First, we gathered similarity ratings for all objects and analyzed the perceptual spaces of both modalities using multidimensional scaling. Next, we performed three categorization tasks that are representative of everyday learning scenarios: in a fully unconstrained task, objects were freely categorized; in a semi-constrained task, exactly three groups had to be created; and in a constrained task, participants received three prototype objects and had to assign all other objects accordingly. We found that the haptic modality was on par with the visual modality both in recovering the topology of the physical object space and in solving the categorization tasks. We also found that within-category similarity was consistently higher than across-category similarity in all categorization tasks, showing how perceptual spaces based on similarity can explain visual and haptic object categorization. Our results suggest that both modalities employ similar processes in forming categories of complex objects.
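
The article itself contains no code; as a minimal illustration of the analysis pipeline sketched in the abstract, the snippet below embeds a pairwise similarity matrix with metric multidimensional scaling and compares within- versus across-category similarity. The ratings, category labels, object count, and the conversion from similarity ratings to dissimilarities are all hypothetical placeholders, not the authors' data or exact procedure.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Hypothetical data: pairwise similarity ratings (1 = dissimilar, 7 = similar)
# for 21 objects, plus category labels from one of the categorization tasks.
n_objects = 21
ratings = rng.integers(1, 8, size=(n_objects, n_objects)).astype(float)
ratings = (ratings + ratings.T) / 2          # enforce a symmetric rating matrix
np.fill_diagonal(ratings, 7.0)               # an object is maximally similar to itself
labels = rng.integers(0, 3, size=n_objects)  # three categories, as in the tasks

# Convert similarities to dissimilarities and recover a low-dimensional
# perceptual space with metric MDS; the stimuli vary along three physical
# parameters, so a 3D embedding is a natural choice here.
dissim = ratings.max() - ratings
space = MDS(n_components=3, dissimilarity="precomputed",
            random_state=0).fit_transform(dissim)
print("3D perceptual-space embedding shape:", space.shape)

# Within- vs. across-category similarity: average the off-diagonal ratings
# for object pairs that share a label versus pairs that do not.
i, j = np.triu_indices(n_objects, k=1)
same = labels[i] == labels[j]
print("mean within-category similarity: ", ratings[i, j][same].mean())
print("mean across-category similarity:", ratings[i, j][~same].mean())
```

With real ratings rather than random placeholders, the within-category mean exceeding the across-category mean would mirror the pattern the abstract reports for all three tasks.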

► We investigate similarity rating and categorization in visual and haptic processing. ► We use complex, parametrically defined objects in our experiments. ► The perceptual spaces for both tasks are highly similar for vision and touch. ► Haptic processing can compete with visual processing even in complex tasks.
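
To make the third highlight concrete: one standard way to quantify how similar two recovered perceptual spaces are is a Procrustes analysis, which aligns two MDS configurations up to translation, uniform scaling, and rotation and reports the residual disparity. The sketch below is illustrative only; the visual and haptic configurations are random stand-ins, and Procrustes is a common choice for this kind of comparison rather than necessarily the authors' method.

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(1)

# Stand-ins for the 3D MDS configurations of the same objects recovered
# from visual and from haptic similarity ratings (hypothetical data).
visual_space = rng.normal(size=(21, 3))
haptic_space = visual_space + 0.1 * rng.normal(size=(21, 3))  # nearly identical space

# procrustes() superimposes one configuration on the other and returns the
# leftover disparity: values near 0 indicate highly similar spaces.
_, _, disparity = procrustes(visual_space, haptic_space)
print(f"Procrustes disparity (0 = identical up to a similarity transform): {disparity:.4f}")
```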

Related Topics
Life Sciences; Neuroscience; Cognitive Neuroscience