Explainable AI for Trustworthy Clinical Decision Support: A Case-Based Reasoning System for Nursing Assistants
Abstract: The increasing complexity of patient data and the need for reliable, transparent decision-making in healthcare drive the demand for advanced AI systems. This paper introduces a Clinical Decision Support System (CDSS) that integrates Explainable AI (XAI) with Retrieval-Augmented Generation (RAG) to deliver transparent, real-time, evidence-based recommendations. Designed specifically for nursing assistants, the HIPAA-compliant system combines real-time patient data with historical case studies and clinical guidelines, retrieving relevant cases through case-based reasoning and applying predictive analytics to generate explainable, mobile-friendly recommendations even in high-pressure scenarios. While this work is preliminary and the system has not been deployed in clinical settings, initial simulations and theoretical analysis suggest its potential to improve decision accuracy and reduce cognitive load. A roadmap for validation through further simulations and clinical trials is outlined to show how the proposed system can be integrated into healthcare workflows.
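To make the case-based retrieval step concrete, the sketch below shows one plausible way the RAG component could rank de-identified historical cases against the current patient context using embedding similarity. All names here (`HistoricalCase`, `retrieve_similar_cases`, the embedding scheme) are illustrative assumptions for exposition, not details taken from the system described in the paper.

```python
# Hypothetical sketch of the case-retrieval step in a RAG-based CDSS.
# The case schema, embedding representation, and scoring function are
# illustrative assumptions, not the paper's actual implementation.
from dataclasses import dataclass
import math


@dataclass
class HistoricalCase:
    case_id: str
    summary: str               # de-identified case narrative
    embedding: list[float]     # precomputed vector for the summary


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Standard cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def retrieve_similar_cases(
    query_embedding: list[float],
    case_library: list[HistoricalCase],
    k: int = 3,
) -> list[HistoricalCase]:
    """Rank historical cases by similarity to the current patient context."""
    ranked = sorted(
        case_library,
        key=lambda c: cosine_similarity(query_embedding, c.embedding),
        reverse=True,
    )
    return ranked[:k]
```

In such a design, the top-k retrieved cases, together with matching guideline passages, would be supplied as grounding context to the generative model, which produces the recommendation and the explanation shown to the nursing assistant.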