
Interfaces for Explanations in Human-AI Interaction: Proposing a Design Evaluation Approach

Explanations in Human-AI Interaction are communicated to human decision makers through interfaces. Yet, it is not clear what consequences the exact representation of such explanations, as part of decision support systems (DSS) operating on machine learning (ML) models, has on human decision making. We observe a need for research methods that allow for measuring the effect that different eXplainable AI (XAI) interface designs have on people’s decision making. In this paper, we argue for adopting research approaches from decision theory for HCI research on XAI interface design. We outline how we used estimation tasks in human-grounded design research in order to introduce a method and measurement for collecting evidence on XAI interface effects. To this end, we investigated representations of LIME explanations in an estimation-task online study as a proof of concept for our proposal.
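As context for readers unfamiliar with LIME, the following is a minimal, hypothetical sketch (not the authors' study code) of how a LIME explanation for a regression model might be generated before being rendered in an XAI interface for an estimation task. The synthetic data, model choice, and parameters are assumptions for illustration only; the `lime` Python package API is used as documented.

```python
# Illustrative sketch: produce a LIME explanation for one prediction of a
# regression model, yielding (feature, weight) pairs that an XAI interface
# could visualize for a human decision maker. Data and model are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))  # synthetic tabular features
y_train = 3 * X_train[:, 0] - 2 * X_train[:, 1] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

explainer = LimeTabularExplainer(
    training_data=X_train,
    feature_names=["f0", "f1", "f2", "f3"],
    mode="regression",
)

# Explain a single instance; num_features limits the explanation length.
explanation = explainer.explain_instance(
    data_row=X_train[0],
    predict_fn=model.predict,
    num_features=4,
)

# The (feature, weight) list is the raw material whose visual representation
# an XAI interface study would vary and evaluate.
for feature, weight in explanation.as_list():
    print(f"{feature}: {weight:+.3f}")
```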

Metadata
Authors: Henrik Mucha, Sebastian Robert, Ruediger Breitschwerdt, Michael Fellmann
URL: https://doi.org/10.1145/3411763.3451759
Title of parent work (English): Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA
Document type: Conference publication
Language: English
Year of publication: 2021
Keyword / tag: Human-AI Interaction
Number of pages: 6
First page: 327