Open Access

ComPact: Edge Collaborative Spatiotemporal Graph Learning for Wind Speed Forecasting

Zaigang Gong1, Siyu Chen1, Qiangsheng Dai2, Ying Feng1, and Jinghui Zhang3 (corresponding author)
1 State Grid Yangzhou Power Supply Company, Yangzhou 225009, China
2 State Grid Jiangsu Electric Power Co. Ltd., Nanjing 210000, China
3 School of Computer Science and Engineering, Southeast University, Nanjing 211189, China

Abstract

In edge-distributed environments, spatiotemporal graphs offer a promising way to capture the complex dependencies among nodes and edges that accurate wind speed forecasting requires. These dependencies involve spatial and temporal interactions that are crucial for modeling dynamic weather patterns. Two challenges, however, limit prediction accuracy: preserving spatial dependency information when the graph is split into spatiotemporal subgraphs, and the high communication cost of the frequent, intensive data exchanges that real-time forecasting across distributed nodes requires. To address these issues, we propose graph coarsening-based cross-subgraph message passing with an edge collaborative training mechanism (ComPact), a novel approach that simplifies graph structures through graph coarsening while preserving essential spatiotemporal dependencies. The coarsening step reduces communication overhead and enables effective cross-subgraph message passing that captures both local and long-range dependencies. ComPact further leverages hierarchical graph learning and structured edge collaboration to integrate global information into local subgraphs, improving predictive performance. Experiments on large-scale datasets, primarily the WindPower dataset, demonstrate ComPact's superiority in wind speed forecasting, with up to a 31.82% reduction in Mean Absolute Error (MAE) and an 11.8% reduction in Mean Absolute Percentage Error (MAPE) compared with federated learning baselines.
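To make the two core ideas in the abstract concrete, the sketch below illustrates, under our own simplifying assumptions, how an edge site might (i) coarsen its local subgraph into a few super-nodes and (ii) exchange only those coarse summaries for cross-subgraph message passing before refining its local node features. The function names (coarsen, cross_subgraph_message_passing, refine), the row-normalized coarse adjacency, and the mixing weight alpha are illustrative assumptions, not the authors' ComPact implementation.

import numpy as np

def coarsen(features, assignment):
    """Average node features into super-node features.
    features:   (n_nodes, d) local node features of one subgraph.
    assignment: (n_nodes,)   cluster id in {0, ..., k-1} for each node.
    Returns a (k, d) matrix of super-node features; only this compact
    summary needs to be shared with other edge sites."""
    k = int(assignment.max()) + 1
    coarse = np.zeros((k, features.shape[1]))
    np.add.at(coarse, assignment, features)                 # sum features per cluster
    counts = np.bincount(assignment, minlength=k).reshape(-1, 1)
    return coarse / np.maximum(counts, 1)                   # cluster-wise mean

def cross_subgraph_message_passing(local_coarse, remote_coarse_list, coarse_adj):
    """One aggregation step over the coarse graph connecting all sites.
    coarse_adj is a row-normalized adjacency over all super-nodes,
    with the local site's super-nodes stacked first."""
    all_coarse = np.vstack([local_coarse] + remote_coarse_list)
    return coarse_adj @ all_coarse                          # neighborhood averaging

def refine(features, assignment, coarse_messages, alpha=0.5):
    """Broadcast aggregated coarse messages back to the local nodes and mix
    them with the original features (alpha is an illustrative weight)."""
    return (1.0 - alpha) * features + alpha * coarse_messages[assignment]

Because each site transmits only k super-node vectors instead of all n node features, the per-round communication volume in this sketch scales with the number of clusters rather than the subgraph size, which is the kind of overhead reduction the abstract attributes to coarsening.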

Tsinghua Science and Technology
Pages 2320-2341
Cite this article:
Gong Z, Chen S, Dai Q, et al. ComPact: Edge Collaborative Spatiotemporal Graph Learning for Wind Speed Forecasting. Tsinghua Science and Technology, 2025, 30(5): 2320-2341. https://doi.org/10.26599/TST.2024.9010261

Received: 02 December 2024
Revised: 15 December 2024
Accepted: 24 December 2024
Published: 29 April 2025
© The Author(s) 2025.

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
