Open Access

Evolved differential model for sporadic graph time-series prediction

Department of Electrical and Computer Engineering, Stony Brook University, Stony Brook, NY 11794, USA
New York University, New York, NY 10012, USA

Abstract

Sensing signals of many real-world network systems, such as traffic networks or microgrids, can be sparse and irregular in both the spatial and temporal domains for reasons such as cost reduction, noise corruption, or device malfunction. Modeling the continuous dynamics of a system from sporadic observations on a network of nodes, generally represented as a graph, is a fundamental but challenging problem. In this paper, we propose a deep learning model called the Evolved Differential Model (EDM) to model a continuous-time stochastic process from partial observations on a graph. Our model incorporates a diffusion convolutional network to parameterize continuous-time system dynamics via a graph Ordinary Differential Equation (ODE) and a graph Stochastic Differential Equation (SDE). The graph ODE accurately captures spatial-temporal relations and extracts hidden features from the data, while the graph SDE efficiently captures the underlying uncertainty of the network system. With its recurrent ODE-SDE scheme, EDM serves as an accurate online predictive model for monitoring and analyzing real-world networked systems. Through extensive experiments on several datasets, we demonstrate that EDM outperforms existing methods in online prediction tasks.
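The abstract describes a recurrent scheme in which a diffusion convolutional network parameterizes both a graph ODE (deterministic drift) and a graph SDE (stochastic uncertainty). The PyTorch sketch below is a minimal illustration of that general idea, not the authors' implementation: the two-hop diffusion operator, the explicit Euler and Euler-Maruyama integrators, the tanh nonlinearity, and the module names (DiffusionConv, ODESDECell) are all our own illustrative assumptions.

```python
import torch
import torch.nn as nn

class DiffusionConv(nn.Module):
    """Illustrative K-hop diffusion convolution: out = sum_k W_k (A_hat^k x)."""
    def __init__(self, dim: int, hops: int = 2):
        super().__init__()
        self.lins = nn.ModuleList(nn.Linear(dim, dim) for _ in range(hops + 1))

    def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim); a_hat: row-normalized adjacency, (num_nodes, num_nodes)
        out = torch.zeros_like(x)
        h = x
        for lin in self.lins:
            out = out + lin(h)
            h = a_hat @ h  # diffuse one more hop along the graph
        return out

class ODESDECell(nn.Module):
    """Hypothetical recurrent step: a deterministic graph-ODE drift update,
    then an Euler-Maruyama graph-SDE update that injects state-dependent noise."""
    def __init__(self, dim: int):
        super().__init__()
        self.drift = DiffusionConv(dim)      # f(h): deterministic dynamics
        self.diffusion = DiffusionConv(dim)  # g(h): scales the Brownian noise

    def forward(self, h, a_hat, dt: float):
        h = h + dt * torch.tanh(self.drift(h, a_hat))         # explicit Euler ODE step
        dw = torch.randn_like(h) * dt ** 0.5                   # Brownian increment ~ N(0, dt)
        return h + torch.tanh(self.diffusion(h, a_hat)) * dw  # Euler-Maruyama SDE step

# Toy usage: evolve hidden states across irregular gaps between observations.
num_nodes, dim = 5, 8
a_hat = torch.rand(num_nodes, num_nodes)
a_hat = a_hat / a_hat.sum(dim=1, keepdim=True)  # row-normalize a random adjacency
cell = ODESDECell(dim)
h = torch.zeros(num_nodes, dim)
for dt in (0.3, 0.1, 0.7):  # sporadic, unevenly spaced time gaps
    h = cell(h, a_hat, dt)
prediction = nn.Linear(dim, 1)(h)  # per-node readout, shape (5, 1)
print(prediction.shape)
```

Because each step accepts an arbitrary dt, the same cell can advance the hidden state across the uneven intervals that sporadic observations produce, which is the property the recurrent ODE-SDE scheme relies on for online prediction.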

Intelligent and Converged Networks
Pages 237-247
Cite this article:
Xing Y, Wu J, Liu Y, et al. Evolved differential model for sporadic graph time-series prediction. Intelligent and Converged Networks, 2024, 5(3): 237-247. https://doi.org/10.23919/ICN.2024.0017

Received: 17 December 2023
Revised: 01 April 2024
Accepted: 25 April 2024
Published: 30 September 2024
© All articles included in the journal are copyrighted by the ITU and TUP.

This work is available under the CC BY-NC-ND 3.0 IGO license: https://creativecommons.org/licenses/by-nc-nd/3.0/igo/
