Dynamic Charging Strategy Optimization for UAV-Assisted Wireless Rechargeable Sensor Networks Based on Deep Q-Network
Abstract: The development of wireless energy transmission technology has significantly propelled the advancement of wireless rechargeable sensor networks (WRSNs). Energy constraint is one of the most critical challenges in the application of WRSNs. Integrating unmanned aerial vehicles (UAVs) with wireless energy transmission technology has emerged as a promising approach to overcoming the energy constraint problem in WRSNs, leveraging the advantages of UAVs such as flexibility and maneuverability. In this article, we consider a WRSN assisted by a UAV and a mobile utility vehicle (MUV), where the UAV serves as a mobile charger that replenishes the energy of sensors and the MUV serves as a mobile base station that replaces the UAV's battery when its energy is insufficient. In this system, we focus on minimizing the death time of sensors and optimizing the energy consumption of the UAV. To address this problem, a multiobjective deep $Q$-network (DQN) algorithm is employed, in which the UAV makes online charging scheduling decisions based on real-time network status and uses experience replay for optimization. Experimental results demonstrate that the proposed algorithm significantly reduces the sensors' death time and effectively decreases the energy consumption of the UAV. Specifically, the proposed algorithm outperforms three other classical algorithms: 1) the genetic algorithm; 2) the greedy algorithm; and 3) the $Q$-learning algorithm.
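The abstract describes a DQN agent that makes online charging decisions from real-time network state and learns via experience replay. The paper's actual network architecture, reward design, and hyperparameters are not given here, so the following is only a minimal NumPy sketch of the general DQN-with-replay pattern under assumed placeholders: the state is the vector of sensors' residual energies, the action is which sensor the UAV charges next, and the reward is the minimum residual energy (a stand-in for avoiding sensor death).

```python
import random
from collections import deque

import numpy as np


class ReplayBuffer:
    """Fixed-size experience replay buffer storing (s, a, r, s', done)."""
    def __init__(self, capacity=10_000):
        self.buffer = deque(maxlen=capacity)

    def push(self, s, a, r, s_next, done):
        self.buffer.append((s, a, r, s_next, done))

    def sample(self, batch_size):
        batch = random.sample(self.buffer, batch_size)
        s, a, r, s_next, done = map(np.array, zip(*batch))
        return s, a, r, s_next, done

    def __len__(self):
        return len(self.buffer)


class DQNAgent:
    """Minimal DQN with a one-hidden-layer MLP Q-network (NumPy only)."""
    def __init__(self, state_dim, n_actions, hidden=32,
                 lr=1e-2, gamma=0.95, epsilon=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (state_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, n_actions))
        self.b2 = np.zeros(n_actions)
        self.lr, self.gamma, self.epsilon = lr, gamma, epsilon
        self.n_actions = n_actions

    def q_values(self, s):
        h = np.maximum(0.0, s @ self.w1 + self.b1)  # ReLU hidden layer
        return h, h @ self.w2 + self.b2

    def act(self, s):
        # Epsilon-greedy online decision: explore occasionally, else greedy.
        if random.random() < self.epsilon:
            return random.randrange(self.n_actions)
        _, q = self.q_values(s)
        return int(np.argmax(q))

    def train_step(self, batch):
        s, a, r, s_next, done = batch
        _, q_next = self.q_values(s_next)
        target = r + self.gamma * q_next.max(axis=1) * (1.0 - done)
        h, q = self.q_values(s)
        td_err = q[np.arange(len(a)), a] - target
        # Gradient of 0.5 * mean(td_err^2) w.r.t. the selected action values.
        grad_q = np.zeros_like(q)
        grad_q[np.arange(len(a)), a] = td_err / len(a)
        grad_h = (grad_q @ self.w2.T) * (h > 0)  # backprop before updates
        self.w2 -= self.lr * h.T @ grad_q
        self.b2 -= self.lr * grad_q.sum(axis=0)
        self.w1 -= self.lr * s.T @ grad_h
        self.b1 -= self.lr * grad_h.sum(axis=0)
        return float((td_err ** 2).mean())


# Toy usage: state = residual energies of 3 sensors; action = sensor to charge.
rng = np.random.default_rng(1)
agent = DQNAgent(state_dim=3, n_actions=3)
buf = ReplayBuffer()
state = np.ones(3)
for t in range(500):
    action = agent.act(state)
    nxt = np.clip(state - rng.uniform(0.05, 0.15, 3), 0.0, 1.0)
    nxt[action] = 1.0                 # charging refills the chosen sensor
    reward = float(nxt.min())         # discourage letting any sensor die
    buf.push(state, action, reward, nxt, 0.0)
    state = nxt
    if len(buf) >= 32:
        loss = agent.train_step(buf.sample(32))
```

The sensor-depletion dynamics, reward, and single-objective scalarization here are illustrative assumptions; the paper's multiobjective formulation (death time plus UAV energy consumption) and the MUV battery-swap mechanics would need to be encoded in the state, action space, and reward to reproduce its method.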
External IDs: dblp:journals/iotj/LiuZLCHCC24