Explainable Edge AI Framework for IoD-Assisted Aerial Surveillance in Extreme Scenarios

Published: 01 Jan 2025 · Last Modified: 15 Apr 2025 · IEEE Internet Things J. 2025 · CC BY-SA 4.0
Abstract: Drones are sophisticated machines that can hover over extreme locations, conduct aerial surveillance, collect surveillance data, and disseminate it to the distributed edge for processing and analysis. The distributed edge deploys advanced artificial intelligence (AI) models to detect unwarranted activities or objects in the surveillance data. However, these lightweight, low-power unmanned aerial vehicles (UAVs) may experience faults under the unprecedented workloads of extreme surveillance domains. In this article, we design an AI framework that detects safety concerns with drones deployed for aerial surveillance in extreme locations, based on real-time drone-critical parameters. We also propose a MapReduce-based object recognition and classification module to efficiently process the large-scale images captured by drones. Conventional AI systems, however, behave as black boxes, leading to a lack of trust and transparency. We therefore convert the traditional AI framework into an explainable edge AI framework using Shapley additive explanations (SHAP) to open the black box. Experimental results demonstrate the effectiveness of the proposed framework in detecting drone safety concerns through explainable health-status tracking, alongside an effective object detection mechanism.
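The SHAP-based explanation idea above attributes a model's prediction to each input feature via Shapley values. As a minimal sketch (not the paper's implementation), the snippet below computes exact Shapley values for a hypothetical drone-health risk model by enumerating all feature coalitions; the `risk` function, its battery/temperature features, and the baseline values are illustrative assumptions, and exact enumeration is tractable only for small feature counts (libraries such as `shap` approximate this at scale).

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley values for one prediction by enumerating all
    feature coalitions. O(2^n) -- for illustration only."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(len(others) + 1):
            for S in combinations(others, k):
                # Shapley weight for a coalition of size |S|
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                with_i = [x[j] if (j in S or j == i) else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += w * (predict(with_i) - predict(without_i))
    return phi

# Hypothetical linear drone-health model: risk from battery level and motor temp
def risk(features):
    battery, temp = features
    return 0.6 * (100 - battery) + 0.4 * temp

x = [20.0, 80.0]       # current reading: low battery, hot motor (assumed values)
base = [100.0, 25.0]   # nominal baseline (assumed values)
phi = shapley_values(risk, x, base)
# Additivity: phi sums to risk(x) - risk(base), attributing the
# elevated risk score to the battery and temperature features.
```

For a linear model the Shapley values reduce to each feature's marginal contribution, so the battery term here accounts for 0.6 × (100 − 20) = 48 of the risk increase and the temperature term for 0.4 × (80 − 25) = 22, summing to risk(x) − risk(base) = 70.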