Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
10370220 | Speech Communication | 2005 | 16 Pages |
Abstract
Results were gathered through experiments in which user interactions with a prototype speech-based navigational system were recorded, post-processed, and analyzed for prosodic content. Subjects participated in two sessions, one using a speech-based, displayless interface, and a second using a multimodal interface that included a visual-tactile map display. Results showed strong evidence of significant changes in subjects' prosodic features when using a displayless versus a multimodal navigational interface for all categories of subjects. Insights gained from this work can be used to improve the design of the user interface for such applications. Also, results of this work can be used to refine the selection of acoustic cues used as predictors in prosodic pattern detection algorithms for these types of applications.
Related Topics
Physical Sciences and Engineering
Computer Science
Signal Processing
Authors
Julie Baca, Joseph Picone