Abstract: Unmanned Aerial Vehicle (UAV) assisted Mobile Edge Computing (MEC) systems provide substantial benefits
for task offloading and communication services, especially in situations where traditional communication
infrastructure is unavailable. Current research emphasizes maintaining communication quality while minimizing total energy consumption and optimizing UAV flight trajectories. However, several issues remain:
First, the energy consumption objective function lacks comprehensiveness, neglecting the impact of UAV
flight energy consumption; second, an effective Deep Reinforcement Learning (DRL) algorithm has not
been employed to address the non-convexity of the objective function; third, there is insufficient discussion
regarding the practical significance of the proposed approach. To address these issues, this paper formulates
an objective function aimed at minimizing MEC energy consumption by considering task offloading decisions,
communication delays, computational energy consumption, and UAV flight energy consumption. We propose
a Population Diversity-based Particle Swarm Optimization-Twin Delayed Deep Deterministic Policy Gradient
(PDPSO-TD3) algorithm to find the optimal solution, enhance UAV flight trajectories through optimized
offloading decisions, ensure efficient communication, and minimize the total energy consumption of the
MEC system. Furthermore, we discuss the practical applicability of PDPSO-TD3 in detail and present the
proposed scheme. Experimental results demonstrate that, compared with the Deep Deterministic Policy Gradient
(DDPG) algorithm, the proposed PDPSO-TD3 algorithm improves transmission delay, MEC energy consumption,
UAV flight energy consumption, and User Equipment (UE) access rate by approximately 14.3%, 10.1%, 6.1%,
and 3.3%, respectively.
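
For concreteness, a minimal sketch of the kind of energy-minimization objective described above is given below; the notation (offloading decision $a_i$, trajectory $\mathbf{q}$, and the per-UE energy and delay terms) is illustrative placeholder notation, not the paper's own formulation:

\[
\min_{\{a_i\},\,\mathbf{q}} \; E_{\mathrm{total}}
= \sum_{i=1}^{N} \Big[ a_i \big( E_i^{\mathrm{tx}} + E_i^{\mathrm{UAV}} \big)
+ (1 - a_i)\, E_i^{\mathrm{loc}} \Big] + E^{\mathrm{fly}}(\mathbf{q}),
\qquad \text{s.t. } T_i \le T_i^{\max}, \; a_i \in \{0,1\},
\]

where $a_i$ is the offloading decision of UE $i$, $E_i^{\mathrm{tx}}$ and $E_i^{\mathrm{UAV}}$ are its transmission and edge-computation energies, $E_i^{\mathrm{loc}}$ is its local-computation energy, $E^{\mathrm{fly}}(\mathbf{q})$ is the UAV flight energy along trajectory $\mathbf{q}$, and $T_i$ is the end-to-end delay bounded by $T_i^{\max}$. The coupling between the discrete decisions $\{a_i\}$ and the continuous trajectory $\mathbf{q}$ is what makes such an objective non-convex and motivates a DRL-based solution.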