Abstract: In disaster response and recovery within 6th Generation (6G) networks, achieving low-latency, energy-efficient communication over compromised infrastructure remains a critical challenge. This paper introduces a framework that integrates a solar-powered High-Altitude Platform (HAP) with multiple Unmanned Aerial Vehicles (UAVs) equipped with Reconfigurable Intelligent Surfaces (RISs) to enhance disaster response capabilities. A hybrid approach combining game theory and multi-agent reinforcement learning (MARL) optimizes UAV energy management, RIS control, and the offloading data rates of ground devices (GDs): game theory determines task offloading decisions that balance energy consumption, latency, and computational efficiency, while MARL dynamically guides UAV trajectories and RIS configurations to maintain robust communication links. A key innovation is an RIS ON/OFF mechanism that conserves energy by switching RISs OFF when they are not needed, allowing UAVs to recharge during inactive periods and extending their operational lifetime. The proposed framework also optimizes offloading data rates and minimizes task offloading costs, ensuring efficient resource utilization. Extensive simulations validate the approach, showing significant improvements in energy efficiency, data processing performance, and overall network reliability compared to traditional methods, and contributing to more reliable and energy-efficient disaster response operations within 6G networks.