

Explainable Case‐Based Reasoning: A Survey

Jakob Michael Schönborn; Rosina O. Weber; David W. Aha; Jörg Cassens; Klaus-Dieter Althoff
In: P. Madumal; S. Tulli; R. Weber; D. Aha (Eds.). Proceedings of the Explainable Agency in AI Workshop at the 35th AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence (AAAI-2021), February 2-9, held virtually, AAAI, 2021.


Various literature surveys state and confirm a rapid increase in research on explainable artificial intelligence (XAI) in recent years. One possible motivation for this change is legal regulation, including the General Data Protection Regulation (GDPR) and similar regulations outside of Europe. Another possible reason is decreasing trust in machine learning systems, since both their algorithms and the models they produce are often opaque. The desire to retrieve an explanation for a given decision reaches back to the era of expert systems in the 1980s. Decisions made by experts often rely on their stored experiences, yet most XAI approaches cannot provide explanations based on specific experiences because they do not retain them. In contrast, explainable case-based reasoning (XCBR) approaches can provide such explanations and are thus of interest to XAI researchers. We present a taxonomy of XCBR approaches, categorizing and presenting current methodologies and implementations based on an extensive literature review. This taxonomy can be used by XAI researchers and by CBR researchers who are explicitly interested in the generation and use of explanations.
