
Displays for Productive Non-Driving Related Tasks: Visual Behavior and Its Impact in Conditionally Automated Driving

(1) Background: Primary driving tasks are increasingly being handled by vehicle automation, so support for non-driving related tasks (NDRTs) is becoming more and more important. However, in SAE L3 automation, vehicles can require the driver-passenger to take over driving controls. Interfaces for NDRTs must therefore guarantee safe operation and should also support productive work. (2) Method: We conducted a within-subjects driving simulator study (N=53) comparing Heads-Up Displays (HUDs) and Auditory Speech Displays (ASDs) for productive NDRT engagement. In this article, we assess the NDRT displays’ effectiveness by evaluating eye-tracking measures and setting them into relation to workload measures, self-ratings, and NDRT/take-over performance. (3) Results: Our data highlight substantially higher gaze dispersion but more extensive glances on the road center in the auditory condition than in the HUD condition during automated driving. We further observed potentially safety-critical glance deviations from the road during take-overs after a HUD was used. These differences are reflected in self-ratings, workload indicators, and take-over reaction times, but not in driving performance. (4) Conclusion: NDRT interfaces can influence visual attention even beyond their usage during automated driving. In particular, the HUD resulted in safety-critical glances during manual driving after take-overs. We found this impacted workload and productivity but not driving performance.

Metadata
Author:Clemens Schartmüller, Klemens Weigl, Andreas Löcken, Philipp Wintersberger, Marco Steinhauser, Andreas Riener
Language:English
Document Type:Article
Year of first Publication:2021
Published in:Multimodal Technologies and Interaction
Publisher:MDPI
Place of publication:Basel
ISSN:2414-4088
Volume:5
Issue:4
Pages:20
Article Number:21
Review:peer-review
Open Access:yes
Version:published
Tag:automated driving; behavior; conditional automation; displays; eye-tracking; non-driving related tasks; performance; productivity; take-over requests; visual attention
URN:urn:nbn:de:bvb:573-10965
Related Identifier:https://doi.org/10.3390/mti5040021
Faculties / Institutes / Organizations:Fakultät Informatik
CARISSMA Institute of Automated Driving (C-IAD)
Human-Computer Interaction Group (HCIG)
Licence:Creative Commons BY 4.0
Release Date:2022/01/04