Abstract: Edge computing poses several challenges for data movement. First, moving large volumes of data from edge devices to the server is likely to waste bandwidth. Second, complex data patterns on devices (e.g., traffic cameras) require flexible handling. An ideal approach is to move code to the data instead. However, since only a small portion of the code is actually required, shipping entire executables along with their libraries to the devices can be overkill. Loading code on demand from a remote file system such as NFS can serve as a stopgap, but it suffers from low efficiency under irregular access patterns. This article presents StreamSys, a lightweight executable delivery system that loads code on demand by redirecting local disk IO to the server through optimized network IO. We employ a Markov-based prefetch mechanism on the server side: it learns the code's access pattern and predicts the upcoming block sequence for the client, reducing network round trips. Meanwhile, the server-side StreamSys asynchronously prereads the predicted block sequence from disk to conceal disk IO latency. Evaluation shows that the latency of StreamSys is up to 71.4% lower than the native Linux file system on an SD card and up to 62% lower than NFS in wired environments.
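The server-side prediction described above can be pictured with a minimal first-order Markov sketch: transitions between observed block IDs are counted, and the most likely successor chain is emitted as the prefetch sequence. The class name, API, and chain depth below are illustrative assumptions, not StreamSys's actual implementation.

```python
from collections import defaultdict

class MarkovPrefetcher:
    """Illustrative first-order Markov predictor over block IDs."""

    def __init__(self):
        # counts[a][b] = how often block b followed block a
        self.counts = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def record(self, block):
        """Observe one block access from the client's request stream."""
        if self.prev is not None:
            self.counts[self.prev][block] += 1
        self.prev = block

    def predict(self, block, depth=3):
        """Return the most likely chain of up to `depth` next blocks."""
        seq, cur = [], block
        for _ in range(depth):
            successors = self.counts.get(cur)
            if not successors:
                break
            cur = max(successors, key=successors.get)
            seq.append(cur)
        return seq

# Example: after observing a repeating access pattern, the predictor
# can hand the client a multi-block prefetch sequence in one round trip.
p = MarkovPrefetcher()
for b in [10, 11, 12, 10, 11, 12]:
    p.record(b)
print(p.predict(10))  # [11, 12, 10]
```

The predicted sequence would then drive the asynchronous disk preread on the server, so blocks are already in memory when the client's next requests arrive.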