Exploring Algorithmic Explainability: Generating Explainable AI Insights for Personalized Clinical Decision Support Focused on Cannabis Intoxication in Young Adults

Tongze Zhang, Tammy Chung, Anind Dey, Sang Won Bae

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

As an increasing number of states adopt more permissive cannabis regulations, the need for a comprehensive understanding of cannabis's effects on young adults has grown rapidly, driven by its escalating prevalence of use. By leveraging popular eXplainable Artificial Intelligence (XAI) techniques such as SHAP (SHapley Additive exPlanations), rule-based explanations, intrinsically interpretable models, and counterfactual explanations, we undertake an exploratory but in-depth examination of the impact of cannabis use on individual behavioral patterns and physiological states. This study explores the possibility of facilitating algorithmic decision-making by combining eXplainable AI (XAI) techniques with sensor data, with the aim of providing researchers and clinicians with personalized analyses of cannabis intoxication behavior. SHAP quantifies the importance and impact of specific factors, such as environmental noise or heart rate, enabling clinicians to pinpoint influential behaviors and environmental conditions. SkopeRules simplify the understanding of cannabis use in the context of specific activities or environments. Decision trees provide a clear visualization of how factors interact to influence cannabis consumption. Counterfactual models help identify key changes in behaviors or conditions that may alter cannabis use outcomes, guiding effective individualized intervention strategies. This multidimensional analytical approach not only unveils changes in behavioral and physiological states after cannabis use, such as frequent fluctuations in activity states, nontraditional sleep patterns, and specific use habits at different times and places, but also highlights the significance of individual differences in responses to cannabis use. These insights carry profound implications for clinicians seeking to gain a deeper understanding of the diverse needs of their patients and for tailoring precisely targeted intervention strategies. Furthermore, our findings highlight the pivotal role that XAI technologies could play in enhancing the transparency and interpretability of Clinical Decision Support Systems (CDSS), with a particular focus on substance misuse treatment. This research contributes to ongoing initiatives aimed at advancing clinical practices that prevent and reduce cannabis-related harms to health, positioning XAI as a supportive tool for clinicians and researchers alike.
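To make the attribution step concrete, the following is a minimal, illustrative sketch of how SHAP values might be computed for an intoxication classifier trained on passively sensed features. The feature names, synthetic data, and gradient-boosting model are assumptions for demonstration only and do not reproduce the study's actual dataset, features, or models.

```python
# Minimal, illustrative sketch (not the study's pipeline): train a simple
# classifier on synthetic passive-sensing features and explain its
# predictions with SHAP. All feature names and data are hypothetical.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(42)
n = 500

# Hypothetical sensor features echoing those named in the abstract
# (heart rate, ambient noise) plus activity and sleep proxies.
X = pd.DataFrame({
    "heart_rate_bpm": rng.normal(75, 12, n),
    "ambient_noise_db": rng.normal(55, 10, n),
    "activity_transitions": rng.poisson(6, n),
    "sleep_onset_hour": rng.normal(0.5, 2.0, n),  # hours after midnight
})

# Synthetic label standing in for reported intoxication episodes.
logits = (
    0.04 * (X["heart_rate_bpm"] - 75)
    + 0.03 * (X["ambient_noise_db"] - 55)
    + 0.2 * (X["activity_transitions"] - 6)
)
y = (logits + rng.normal(0, 1, n) > 0).astype(int)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# TreeExplainer yields per-feature, per-sample contributions: the basis
# for the individualized factor attribution described in the abstract
# (e.g., how much heart rate pushed a single prediction up or down).
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global view: mean absolute SHAP value per feature.
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))
```

The same per-sample contributions can be inspected participant by participant, which is how this kind of analysis supports personalized, rather than only population-level, interpretation.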

Original language: English
Title of host publication: 2024 International Conference on Activity and Behavior Computing, ABC 2024
ISBN (Electronic): 9798350375503
DOIs
State: Published - 2024
Event: 2024 International Conference on Activity and Behavior Computing, ABC 2024 - Oita/Kitakyushu, Japan
Duration: 29 May 2024 - 31 May 2024

Publication series

Name: 2024 International Conference on Activity and Behavior Computing, ABC 2024

Conference

Conference: 2024 International Conference on Activity and Behavior Computing, ABC 2024
Country/Territory: Japan
City: Oita/Kitakyushu
Period: 29/05/24 - 31/05/24

Keywords

  • Algorithmic Decisions
  • Algorithmic Explainability
  • Cannabis Intoxication
  • Cannabis-Intoxicated Behaviors
  • CDSS
  • Clinical Decision Support Systems
  • Explainable AI
  • Explainable Artificial Intelligence
  • Passive Sensing
  • Personalized CDSS
  • Personalized Intervention
  • Transparency
  • XAI
