Article code: 7257546
Journal code: 1472430
Publication year: 2017
English article: 17-page PDF
Full-text version: Free download
English title of the ISI article
Take-over requests in highly automated driving: A crowdsourcing survey on auditory, vibrotactile, and visual displays
Related subjects
Humanities and Social Sciences > Psychology > Applied Psychology
English abstract
An important research question in the domain of highly automated driving is how to aid drivers in transitions between manual and automated control. Until highly automated cars are available, knowledge on this topic has to be obtained via simulators and self-report questionnaires. Using crowdsourcing, we surveyed 1692 people on auditory, visual, and vibrotactile take-over requests (TORs) in highly automated driving. The survey presented recordings of auditory messages and illustrations of visual and vibrational messages in traffic scenarios of various urgency levels. Multimodal TORs were the most preferred option in high-urgency scenarios. Auditory TORs were the most preferred option in low-urgency scenarios and as a confirmation message that the system is ready to switch from manual to automated mode. For low-urgency scenarios, visual-only TORs were preferred over vibration-only TORs. Beeps with shorter interpulse intervals were perceived as more urgent, with Stevens' power law yielding an accurate fit to the data. Spoken messages were better accepted than abstract sounds, and the female voice was preferred over the male voice. Preferences and perceived urgency ratings were similar in middle- and high-income countries. In summary, this international survey showed that people's preferences for TOR types in highly automated driving depend on the urgency of the situation.
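Note on the Stevens' power law fit: the abstract states only that perceived urgency of beeps follows Stevens' power law as a function of interpulse interval; the fitted constants are not given here. As a sketch of the general form of that law (the exact parameterization used in the paper is an assumption),

\psi = k \, \varphi^{a}

where \psi is the perceived urgency, \varphi is the interpulse interval, k is a scaling constant, and a is a fitted exponent. Since shorter intervals were rated as more urgent, the exponent a would be negative under this parameterization.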
Publisher
Database: Elsevier - ScienceDirect
Journal: Transportation Research Part F: Traffic Psychology and Behaviour