Explainable artificial intelligence for spectroscopy data: a review

in: Pflügers Archiv - European Journal of Physiology (2025)
Contreras, Jhonatan; Bocklitz, Thomas W.
Explainable artificial intelligence (XAI) has gained significant attention in various domains, including natural and medical image analysis. However, its application in spectroscopy remains relatively unexplored. This systematic review aims to fill this gap by providing a comprehensive overview of the current landscape of XAI in spectroscopy and identifying potential benefits and challenges associated with its implementation. Following the PRISMA 2020 guideline, we conducted a systematic search across major journal databases, resulting in 259 initial search results. After removing duplicates and applying inclusion and exclusion criteria, 21 scientific studies were included in this review. Notably, most of the studies applied XAI methods to spectral data analysis, emphasizing the identification of significant spectral bands rather than specific intensity peaks. Among the most utilized XAI techniques were SHapley Additive exPlanations (SHAP), masking methods inspired by Local Interpretable Model-agnostic Explanations (LIME), and Class Activation Mapping (CAM). These methods were favored for their model-agnostic nature and ease of use, enabling interpretable explanations without modifying the original models. Future research should propose new methods and explore the adaptation of XAI techniques employed in other domains to better suit the unique characteristics of spectroscopic data.
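
To make the kind of analysis summarized above concrete, the sketch below shows one common way such attributions are obtained for spectral data. It is a minimal, hypothetical example: the synthetic spectra, the random-forest classifier, and the injected band around channels 115-125 are invented for illustration and do not come from any of the reviewed studies. SHAP values are computed per spectral channel and averaged into a band-level importance profile, without modifying the trained model.

```python
# Hypothetical sketch only: synthetic spectra and a generic classifier,
# not the pipeline of any study covered in this review.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "spectra": 200 samples x 500 spectral channels (e.g. wavenumber bins).
n_samples, n_channels = 200, 500
X = rng.normal(size=(n_samples, n_channels))
y = rng.integers(0, 2, size=n_samples)
# Inject a class-dependent band around channels 115-125 so there is something to explain.
X[y == 1, 115:125] += 1.5

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values for tree ensembles without modifying the model;
# KernelExplainer would play the same role for an arbitrary (model-agnostic) predictor.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Depending on the SHAP version, the output is a list per class or a 3D array.
sv = shap_values[1] if isinstance(shap_values, list) else shap_values
if sv.ndim == 3:  # shape (samples, channels, classes): keep attributions for class 1
    sv = sv[:, :, 1]

# The mean absolute SHAP value per channel serves as a band-level importance profile.
band_importance = np.abs(sv).mean(axis=0)
top_channels = np.argsort(band_importance)[::-1][:10]
print("Most influential spectral channels:", top_channels)
```

In practice such a channel-wise profile would be plotted against the wavenumber axis and inspected for contiguous regions, which matches the observation above that interpretations typically target spectral bands rather than individual intensity peaks.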
