Publication

A Visually Explainable Learning System for Skin Lesion Detection Using Multiscale Input with Attention U-Net

Ho Minh Duy Nguyen, Abraham Ezema, Fabrizio Nunnari, Daniel Sonntag

In: Ute Schmid, Franziska Klügl, Diedrich Wolter (eds.). KI 2020: Advances in Artificial Intelligence. 43rd German Conference on Artificial Intelligence (KI 2020), September 21-25, Bamberg, Germany. Lecture Notes in Computer Science (LNCS), Vol. 12325, pp. 313-319, Springer, 9/2020.

Abstract

In this work, we propose a new approach to automatically predict the locations of visual dermoscopic attributes for Task 2 of the ISIC 2018 Challenge. Our method is based on the Attention U-Net with multi-scale images as input. We apply a new strategy based on transfer learning, i.e., training the deep network for feature extraction by adapting the weights of the network trained for segmentation. Our tests show that, first, the proposed algorithm is on par with or outperforms the best ISIC 2018 architectures (LeHealth and NMN) in the extraction of two visual features. Second, it uses only 1/30 of the training parameters and thus requires less computation and memory, which is particularly useful for future implementations on mobile devices. Finally, our approach generates visually explainable behaviour with uncertainty estimations to help doctors in diagnosis and treatment decisions.
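The following is a minimal sketch, not the authors' code, of the two building blocks the abstract names: an additive attention gate in the style of Attention U-Net, and multi-scale input injection, where a downsampled copy of the image is concatenated to the encoder features at a lower resolution. Channel counts, spatial sizes, and module names are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionGate(nn.Module):
    """Additive attention gate: re-weights skip features x using gating signal g."""

    def __init__(self, in_ch_x, in_ch_g, inter_ch):
        super().__init__()
        self.theta_x = nn.Conv2d(in_ch_x, inter_ch, kernel_size=1)
        self.phi_g = nn.Conv2d(in_ch_g, inter_ch, kernel_size=1)
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1)

    def forward(self, x, g):
        # Bring the (coarser) gating signal to the spatial size of the skip connection.
        g_up = F.interpolate(self.phi_g(g), size=x.shape[2:], mode="bilinear",
                             align_corners=False)
        # Attention coefficients in [0, 1] scale the skip features element-wise.
        att = torch.sigmoid(self.psi(F.relu(self.theta_x(x) + g_up)))
        return x * att


class MultiScaleInput(nn.Module):
    """Concatenates a downsampled copy of the input image to encoder features."""

    def __init__(self, scale):
        super().__init__()
        self.scale = scale

    def forward(self, image, features):
        small = F.interpolate(image, scale_factor=1.0 / self.scale,
                              mode="bilinear", align_corners=False)
        return torch.cat([features, small], dim=1)


if __name__ == "__main__":
    x = torch.randn(1, 64, 128, 128)   # skip features from the encoder
    g = torch.randn(1, 128, 64, 64)    # gating signal from the decoder
    gated = AttentionGate(64, 128, 32)(x, g)
    img = torch.randn(1, 3, 256, 256)  # full-resolution dermoscopic image
    fused = MultiScaleInput(scale=2)(img, x)
    print(gated.shape, fused.shape)    # (1, 64, 128, 128) and (1, 67, 128, 128)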

Projects

Further Links

KI_2020.pdf (pdf, 584 KB)

German Research Center for Artificial Intelligence