Abstract: Edge artificial intelligence (AI) has emerged as a promising paradigm for handling the explosive growth of smart applications by offloading computation-intensive deep neural network (DNN) inference to an edge network for processing. The surge of edge AI brings new vigor and vitality to the prospect of smart transportation. However, when the cooperation among heterogeneous edge devices and the operation precedence among DNN tasks are taken into account, it remains challenging to decompose a DNN across multiple edge devices in an edge network with a general topology so as to minimize DNN inference delay. In this article, we devise a polynomial-time optimal solution to the DNN inference offloading problem for smart roadside applications, in which the roadside edge network is typically organized in a chain topology. Specifically, the DNN inference offloading problem for the roadside edge network with chain topology is transformed into an equivalent graph optimization problem. Theoretical analysis and extensive evaluations validate the performance of the proposed solution in minimizing the total inference delay.
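To give a feel for the kind of transformation the abstract refers to, the sketch below casts a simplified version of the problem, a chain-structured DNN split across a chain of edge devices, as a shortest-path-style dynamic program over a layered graph whose edge weights combine computation and transmission delays. This is only an illustrative stand-in under strong simplifying assumptions, not the article's actual graph construction; all layer workloads, device speeds, and link bandwidths are hypothetical placeholders.

```python
import math

# Hypothetical per-layer workload (GFLOPs) and output size (MB) of a chain-structured DNN.
layer_flops  = [0.8, 1.6, 2.4, 1.2, 0.6]
layer_out_mb = [4.0, 2.0, 1.0, 0.5, 0.1]
input_mb     = 6.0                       # size of the raw input fed to the first layer

# Hypothetical chain of edge devices: compute speed (GFLOPs/s) and the
# bandwidth (MB/s) of each link between consecutive devices in the chain.
device_speed  = [5.0, 10.0, 20.0]
link_mb_per_s = [50.0, 100.0]            # link k connects device k and device k+1

L, D = len(layer_flops), len(device_speed)

def hop_delay(size_mb, src, dst):
    """Delay to forward size_mb along the chain from device src to device dst (src <= dst)."""
    return sum(size_mb / link_mb_per_s[k] for k in range(src, dst))

# dp[i][d]: minimum delay to finish layers 0..i with layer i's output residing on
# device d, under the constraint that data only moves forward along the chain.
dp = [[math.inf] * D for _ in range(L)]
for d in range(D):
    dp[0][d] = hop_delay(input_mb, 0, d) + layer_flops[0] / device_speed[d]

for i in range(1, L):
    for d in range(D):
        dp[i][d] = min(
            dp[i - 1][s] + hop_delay(layer_out_mb[i - 1], s, d)
            for s in range(d + 1)
        ) + layer_flops[i] / device_speed[d]

print(f"minimum end-to-end inference delay: {min(dp[-1]):.3f} s")
```

Because the DP explores every (layer, device) pair and every forward transfer between them, it runs in polynomial time in the number of layers and devices; the article's solution addresses the richer setting with heterogeneous devices and operation precedence among DNN tasks.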