DeepSave: saving DNN inference during handovers on the edge

SEC 2019 · 2019 (modified: 13 Nov 2022)
Abstract: Recent advances in deep neural networks (DNNs) have substantially improved the accuracy and speed of a variety of intelligent applications, for example, real-time video classification. However, one remaining challenge is maintaining quality of service during handovers to avoid interruptions. Inspired by recently developed DNN partition schemes, in which DNN model inference is partitioned and jointly processed on a mobile device and its connected edge-computing server, we propose DeepSave, a solution that saves a large portion of the consecutive video frames that cannot otherwise be handled during handovers. DeepSave comprises two subschemes: (1) the Frame Choosing Scheme determines which frames to save during a handover, maximizing the number of saved frames while preserving inference accuracy; (2) the Last Arriving Frame Repartition Scheme, with a provable performance bound, processes the last frame arriving before the end of the handover as quickly as possible, so that frames arriving after the handover can be processed as usual without causing congestion. We have built a real-world prototype and conducted field experiments and extensive simulations, showing that DeepSave can save up to 50.98% of frames during handovers, substantially more than the benchmark schemes.
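The abstract builds on DNN partition schemes in which the front part of a model runs on the device and the remainder runs on the edge server. The following is a minimal sketch of that general idea only, not the paper's DeepSave implementation: the NumPy stand-in model, layer sizes, split point, and function names are all illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-in for a DNN: a stack of dense layers with ReLU.
# The real model, split point, and device-server transport are not
# specified in the abstract.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 64)) for _ in range(6)]

def run_layers(x, layer_slice):
    """Apply a consecutive slice of layers (matmul + ReLU) to activation x."""
    for w in layer_slice:
        x = np.maximum(x @ w, 0.0)
    return x

def partitioned_inference(frame, split):
    """Run layers [0, split) on the device and [split, end) on the edge server."""
    head_out = run_layers(frame, layers[:split])      # device-side head
    # ...the intermediate activation would be sent over the wireless link,
    # which is exactly what is interrupted during a handover...
    tail_out = run_layers(head_out, layers[split:])   # server-side tail
    return tail_out

frame = rng.standard_normal(64)  # stand-in features for one video frame
out = partitioned_inference(frame, split=2)
print(out.shape)
```

Under this view, a handover stalls the transfer of the device-side activations, which is why DeepSave focuses on choosing which buffered frames to keep and on repartitioning the last frame so the backlog clears quickly once connectivity resumes.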