Article ID: 507489
Journal: Computers & Geosciences
Published Year: 2013
Pages: 9 Pages
File Type: PDF
Abstract

• Multimodal map interaction (haptic and audio feedback) is supported.
• No pre-built virtual environments are used; the map is generated “on-the-fly”.
• The exploration of any map area around the world is supported.
• The generated map is accurate, as its construction is based on OpenStreetMap data.
• The tool gives blind users access to previously inaccessible information.

The use of spatial (geographic) information is becoming ever more central and pervasive in today’s internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for people with visual impairments or blindness, because they cannot interpret graphical information. Alternative ways of presenting maps therefore have to be explored in order to improve map accessibility. Other sensory modalities, such as touch and hearing, can substitute for vision during map exploration, and multimodal virtual environments appear to be a promising alternative for people with visual impairments. The present paper introduces a tool for the automatic generation of multimodal maps with haptic and audio feedback from OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during haptic exploration of the map.
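The abstract does not detail how the elevation map is built from OpenStreetMap data, but the idea can be illustrated with a minimal sketch. The snippet below assumes that road geometries have already been fetched from OpenStreetMap and projected into grid coordinates; the function names, the grid size, and the convention of engraving roads as grooves below a flat background are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's actual pipeline): rasterize OSM road
# polylines into a 2-D "elevation" grid that a haptic renderer could sample.
# Roads are engraved as grooves below a flat background, one common
# convention for tactile street maps.
import numpy as np

def polyline_to_cells(p0, p1, shape):
    """Return the grid cells crossed by the segment p0 -> p1 (row/col pairs)."""
    n = int(max(abs(p1[0] - p0[0]), abs(p1[1] - p0[1]))) + 1
    rows = np.linspace(p0[0], p1[0], n).round().astype(int)
    cols = np.linspace(p0[1], p1[1], n).round().astype(int)
    keep = (rows >= 0) & (rows < shape[0]) & (cols >= 0) & (cols < shape[1])
    return rows[keep], cols[keep]

def build_elevation_map(roads, shape=(256, 256), groove_depth=1.0):
    """roads: list of polylines, each a list of (row, col) grid coordinates
    already projected from OSM lat/lon into the selected map window."""
    elevation = np.ones(shape, dtype=np.float32)           # flat background
    for polyline in roads:
        for p0, p1 in zip(polyline[:-1], polyline[1:]):
            r, c = polyline_to_cells(p0, p1, shape)
            elevation[r, c] = 1.0 - groove_depth            # engrave the road
    return elevation

# Hypothetical usage: two streets crossing near the centre of the window.
roads = [[(20, 10), (128, 128), (240, 200)],
         [(10, 220), (128, 128), (230, 30)]]
height_field = build_elevation_map(roads)
# `height_field` would then be fed to the haptic device's surface renderer
# and paired with sonification / TTS cues tied to the corresponding cells.
```

Because the height field is generated directly from the downloaded OSM geometry, any map area can be produced on demand without pre-built virtual environments, which is the property the highlights emphasise.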

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Science Applications
Authors