In edge-distributed environments, spatiotemporal graphs offer a promising way to capture the complex dependencies among nodes and edges needed for accurate wind speed forecasting. These dependencies span both spatial and temporal interactions that are crucial for modeling dynamic weather patterns. However, preserving spatial dependency information across spatiotemporal subgraphs is difficult, and losing it reduces prediction accuracy. In addition, the frequent, data-intensive exchanges required for real-time forecasting across distributed nodes incur high communication costs. To address these issues, we propose ComPact, a graph coarsening-based cross-subgraph message passing approach with an edge collaboration training mechanism that simplifies graph structures while preserving essential spatiotemporal dependencies. The coarsening step reduces communication overhead and enables effective cross-subgraph message passing that captures both local and long-range dependencies. ComPact further leverages hierarchical graph learning and structured edge collaboration to inject global information into local subgraphs, improving predictive performance. Experiments on large-scale datasets, primarily the WindPower dataset, show that ComPact outperforms federated learning baselines in wind speed forecasting, reducing Mean Absolute Error (MAE) by up to 31.82% and Mean Absolute Percentage Error (MAPE) by 11.8%.
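As a rough illustration of the coarsening and cross-subgraph message passing idea described above, the Python sketch below pools each subgraph onto super-nodes and then runs one round of message passing over the combined coarse graph. It is a minimal sketch under assumed inputs (hard cluster assignments per node, a given inter-subgraph adjacency); the function names `coarsen` and `cross_subgraph_step` are illustrative and not taken from ComPact itself.

```python
# Minimal sketch: coarsen each subgraph, then exchange messages across coarse graphs.
# Assumptions: hard cluster assignments and an externally supplied inter-subgraph adjacency.
import numpy as np

def coarsen(adj, feat, assign):
    """Pool a subgraph onto super-nodes given a hard assignment vector (ints in [0, k))."""
    k = assign.max() + 1
    P = np.eye(k)[assign]                          # (n, k) one-hot assignment matrix
    coarse_adj = P.T @ adj @ P                     # super-node connectivity
    counts = P.sum(axis=0, keepdims=True).T        # nodes per super-node, shape (k, 1)
    coarse_feat = (P.T @ feat) / np.maximum(counts, 1)  # mean-pooled super-node features
    return coarse_adj, coarse_feat, P

def cross_subgraph_step(coarse_adjs, coarse_feats, inter_adj):
    """One round of message passing over the super-nodes of all subgraphs."""
    feats = np.vstack(coarse_feats)                # stack super-nodes of every subgraph
    n = feats.shape[0]
    A = np.zeros((n, n))
    offset = 0
    for ca in coarse_adjs:                         # block-diagonal intra-subgraph links
        k = ca.shape[0]
        A[offset:offset + k, offset:offset + k] = ca
        offset += k
    A = A + inter_adj                              # cross-subgraph edges (assumed given)
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)
    return np.tanh((A / deg) @ feats)              # mean aggregation + nonlinearity
```

Only the small coarse-graph tensors cross subgraph boundaries in this sketch, which is the mechanism by which coarsening keeps communication overhead low while still propagating long-range information.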


Graph data have extensive applications across domains, including social networks, biological reaction networks, and molecular structures. Graph classification aims to predict the properties of entire graphs and plays a crucial role in many downstream applications. However, existing graph neural network methods require large amounts of labeled data during training. In real-world scenarios, labels are extremely costly to acquire, so labeled samples typically make up only a small portion of the training data, which limits model performance. Current semi-supervised graph classification methods, such as those based on pseudo-labels and knowledge distillation, still struggle to exploit unlabeled graph data effectively and to mitigate pseudo-label bias. To address these challenges, we propose SCoAMPS, a Semi-supervised graph Contrastive learning method based on an Associative Memory network and Pseudo-label Similarity. SCoAMPS integrates pseudo-labeling with contrastive learning: it generates contrastive views through multiple encoders, selects positive and negative samples using pseudo-label similarity, and employs an associative memory network to alleviate pseudo-label bias. Experimental results show that SCoAMPS achieves significant performance improvements on multiple public datasets.
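As a loose illustration of how pseudo-label similarity can drive positive/negative selection in a contrastive objective, the sketch below compares pseudo-label distributions across graphs and treats sufficiently similar pairs as positives in an InfoNCE-style loss. The threshold, temperature, and function name are assumptions for illustration, not SCoAMPS's exact formulation.

```python
# Minimal sketch: pseudo-label similarity decides which cross-view pairs count as positives.
# Assumed inputs: two encoder views of the same graph batch and per-graph pseudo-label probabilities.
import torch
import torch.nn.functional as F

def pseudo_label_contrastive_loss(z1, z2, probs, tau=0.5, sim_thresh=0.8):
    """z1, z2: graph embeddings from two encoder views, shape (B, d).
    probs: pseudo-label probability vectors, shape (B, C)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    # Cosine similarity between pseudo-label distributions selects positive pairs.
    p = F.normalize(probs, dim=1)
    pos_mask = (p @ p.T >= sim_thresh).float()              # (B, B), diagonal always positive
    logits = (z1 @ z2.T) / tau                               # cross-view embedding similarities
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # Average log-likelihood over each anchor's positive set.
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(pos_mask * log_prob).sum(dim=1) / pos_count
    return loss.mean()
```

Because every anchor is at least positive with its own counterpart in the other view, unlabeled graphs contribute to training even when no other graph shares a similar pseudo-label distribution.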