Dermatologist-like explainable AI enhances trust and confidence in diagnosing melanoma

Tirtha Chanda, Katja Hauser, Sarah Hobelsberger, Tabea-Clara Bucher, Carina Nogueira Garcia, Christoph Wies, Harald Kittler, Philipp Tschandl, Cristian Navarrete-Dechent, Sebastian Podlipnik, Emmanouil Chousakos, Iva Crnaric, Jovana Majstorovic, Linda Alhajwan, Tanya Foreman, Sandra Peternel, Sergei Sarap, Irem Oezdemir, Raymond L. Barnhill, Mar Llamas-Velasco, Gabriela Poch, Soeren Korsing, Wiebke Sondermann, Frank Friedrich Gellrich, Markus V. Heppt, Michael Erdmann, Sebastian Haferkamp, Konstantin Drexler, Matthias Goebeler, Bastian Schilling, Jochen S. Utikal, Kamran Ghoreschi, Stefan Froehling, Eva Krieghoff-Henning, Reader Study Consortium, Titus J. Brinker

Publication: Contribution to journal › Original article › Peer-reviewed

2 citations (Web of Science)

Abstract

Artificial intelligence (AI) systems have been shown to help dermatologists diagnose melanoma more accurately; however, they lack transparency, hindering user acceptance. Explainable AI (XAI) methods can help to increase transparency, yet often lack precise, domain-specific explanations. Moreover, the impact of XAI methods on dermatologists' decisions has not yet been evaluated. Building upon previous research, we introduce an XAI system that provides precise and domain-specific explanations alongside its differential diagnoses of melanomas and nevi. Through a three-phase study, we assess its impact on dermatologists' diagnostic accuracy, diagnostic confidence, and trust in the XAI support. Our results show strong alignment between XAI and dermatologist explanations. We also show that dermatologists' confidence in their diagnoses and their trust in the support system significantly increase with XAI compared to conventional AI. This study highlights dermatologists' willingness to adopt such XAI systems, promoting their future use in the clinic.

Artificial intelligence has become popular as a cancer classification tool, but there is distrust of such systems due to their lack of transparency. Here, the authors develop an explainable AI system that produces text- and region-based explanations alongside its classifications, which was assessed using clinicians' diagnostic accuracy, diagnostic confidence, and their trust in the system.
Original language: English
Pages (from - to): 524
Number of pages: 17
Journal: Nature Communications
Volume: 15
Issue number: 1
DOIs
Publication status: Published - 15 Jan 2024

Fingerprint

Explore the research topics of "Dermatologist-like explainable AI enhances trust and confidence in diagnosing melanoma". Together they form a unique fingerprint.

Cite this