Abstract: Owing to the explosive growth of the Internet of Things (IoT), vast volumes of sensor-generated time series data are produced at many different locations. Processing this data in the central cloud incurs heavy network usage, so the point of data processing has gradually shifted from the central cloud to servers at the edge of the Internet, called edge servers. This paradigm shift makes training a neural network model challenging, because aggregating the data generated at different locations transmits a massive amount of data across the network. Federated learning resolves this issue by exchanging model parameters between edge servers and the central cloud instead of raw data. Meanwhile, Spatio-Temporal Graph Neural Networks (STGNNs) have gained much attention for analyzing time series data transmitted from multiple locations in IoT environments. Despite the growing number of STGNN models, the integration of STGNNs with federated learning remains underexplored. This paper presents FedSTGNN, a framework that seamlessly adapts arbitrary STGNN models to edge computing environments. The proposed method partitions the spatio-temporal graph according to a mathematical definition and trains an arbitrary STGNN model on each partition separately on an edge server. It then aggregates all the model parameters in the central cloud with dramatically reduced network usage. Through experiments, we demonstrate that FedSTGNN can train models with substantially less network communication, exploiting the advantages of edge computing, at only a slight loss in accuracy.
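To make the aggregation step concrete, the following is a minimal sketch of a FedAvg-style weighted average of edge-server parameters, as described in the abstract. The `TinySTGNN` placeholder model, the `federated_average` helper, and the partition sizes are hypothetical illustrations, not the paper's actual implementation.

```python
# Minimal sketch: FedAvg-style aggregation of partition-wise models in the cloud.
# TinySTGNN, federated_average, and the partition sizes are hypothetical examples.
import copy
from typing import Dict, List

import torch
import torch.nn as nn


class TinySTGNN(nn.Module):
    """Placeholder stand-in for an arbitrary STGNN trained on one graph partition."""

    def __init__(self, in_dim: int = 8, hidden: int = 16):
        super().__init__()
        self.temporal = nn.GRU(in_dim, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.temporal(x)        # temporal encoding of the node signal
        return self.readout(out[:, -1])  # predict from the last time step


def federated_average(
    local_states: List[Dict[str, torch.Tensor]],
    partition_sizes: List[int],
) -> Dict[str, torch.Tensor]:
    """Aggregate edge-server parameters in the cloud, weighted by partition size."""
    total = float(sum(partition_sizes))
    global_state = copy.deepcopy(local_states[0])
    for key in global_state:
        global_state[key] = sum(
            state[key] * (n / total)
            for state, n in zip(local_states, partition_sizes)
        )
    return global_state


# Usage: three edge servers each hold a locally trained copy; only parameters
# (not raw sensor data) travel to the cloud for aggregation.
edge_models = [TinySTGNN() for _ in range(3)]
global_state = federated_average(
    [m.state_dict() for m in edge_models], partition_sizes=[120, 80, 100]
)
for m in edge_models:
    m.load_state_dict(global_state)  # broadcast the aggregated model back
```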