[1]
X. Jiang, P. Ji, and S. Li, CensNet: Convolution with edge-node switching in graph neural networks, in Proc. 28th Int. Joint Conf. Artificial Intelligence, Macao, China, 2019, pp. 2656–2662.
[2]
R. Ying, J. You, C. Morris, X. Ren, W. L. Hamilton, and J. Leskovec, Hierarchical graph representation learning with differentiable pooling, in Proc. 32nd Conf. Neural Information Processing Systems (NeurIPS 2018), Montréal, Canada, 2018, pp. 4805–4815.
[3]
M. Simonovsky and N. Komodakis, Dynamic edge-conditioned filters in convolutional neural networks on graphs, in Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 2017, pp. 29–38.
[4]
T. N. Kipf and M. Welling, Semi-supervised classification with graph convolutional networks, in Proc. 5th Int. Conf. Learning Representations (ICLR), Toulon, France, 2017.
[5]
T. Kipf, E. Fetaya, K. C. Wang, M. Welling, and R. Zemel, Neural relational inference for interacting systems, in Proc. 35th Int. Conf. Machine Learning, Stockholm, Sweden, 2018, pp. 2688–2697.
[6]
B. Yu, H. Yin, and Z. Zhu, ST-UNet: A spatio-temporal U-network for graph-structured time series modeling, arXiv preprint arXiv: 1903.05631, 2019.
[7]
B. Yu, M. Li, J. Zhang, and Z. Zhu, 3D graph convolutional networks with temporal graphs: A spatial information free framework for traffic forecasting, arXiv preprint arXiv: 1903.00919, 2019.
[8]
C. Sun, P. Karlsson, J. Wu, J. B. Tenenbaum, and K. Murphy, Predicting the present and future states of multi-agent systems from partially-observed visual data, in Proc. 7th Int. Conf. Learning Representations (ICLR), New Orleans, LA, USA, 2019.
[10]
A. Sanchez-Gonzalez, V. Bapst, K. Cranmer, and P. W. Battaglia, Hamiltonian graph networks with ODE integrators, arXiv preprint arXiv: 1909.12790, 2019.
[11]
M. Poli, S. Massaroli, J. Park, A. Yamashita, H. Asama, and J. Park, Graph neural ordinary differential equations, arXiv preprint arXiv: 1911.07532, 2019.
[12]
Z. Huang, Y. Sun, and W. Wang, Learning continuous system dynamics from irregularly-sampled partial observations, in Proc. 34th Conf. Neural Information Processing Systems (NeurIPS 2020), virtual, 2020, pp. 16177–16187.
[13]
M. Defferrard, X. Bresson, and P. Vandergheynst, Convolutional neural networks on graphs with fast localized spectral filtering, arXiv preprint arXiv: 1606.09375, 2016.
[14]
Y. Li, R. Yu, C. Shahabi, and Y. Liu, Diffusion convolutional recurrent neural network: Data-driven traffic forecasting, arXiv preprint arXiv: 1707.01926, 2017.
[15]
Y. Seo, M. Defferrard, P. Vandergheynst, and X. Bresson, Structured sequence modeling with graph convolutional recurrent networks, in Proc. 25th Int. Conf. Neural Information Processing (ICONIP 2018), Siem Reap, Cambodia, 2018, pp. 362–373.
[17]
B. Yu, H. Yin, and Z. Zhu, Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting, in Proc. 27th Int. Joint Conf. Artificial Intelligence, Stockholm, Sweden, 2018, pp. 3634–3640.
[18]
Z. Diao, X. Wang, D. Zhang, Y. Liu, K. Xie, and S. He, Dynamic spatial-temporal graph convolutional neural networks for traffic forecasting, in Proc. 33rd AAAI Conf. Artificial Intelligence, 31st Innovative Applications of Artificial Intelligence Conf., 9th AAAI Symp. Educational Advances in Artificial Intelligence, Honolulu, HI, USA, 2019, pp. 890–897.
[19]
A. Pareja, G. Domeniconi, J. Chen, T. Ma, T. Suzumura, H. Kanezashi, T. Kaler, T. Schardl, and C. Leiserson, EvolveGCN: Evolving graph convolutional networks for dynamic graphs, in Proc. 34th AAAI Conf. Artificial Intelligence (AAAI-20), New York, NY, USA, 2020, pp. 5363–5370.
[20]
J. You, R. Ying, X. Ren, W. L. Hamilton, and J. Leskovec, GraphRNN: Generating realistic graphs with deep auto-regressive models, arXiv preprint arXiv: 1802.08773, 2018.
[21]
Y. Li, O. Vinyals, C. Dyer, R. Pascanu, and P. Battaglia, Learning deep generative models of graphs, arXiv preprint arXiv: 1803.03324, 2018.
[22]
R. Liao, Y. Li, Y. Song, S. Wang, W. Hamilton, D. K. Duvenaud, R. Urtasun, and R. Zemel, Efficient graph generation with graph recurrent attention networks, in Proc. 33rd Conf. Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 2019, pp. 4255–4265.
[23]
H. Shrivastava, X. Chen, B. Chen, G. Lan, S. Aluru, H. Liu, and L. Song, GLAD: Learning sparse graph recovery, in Proc. 8th Int. Conf. Learning Representations (ICLR), virtual, 2020.
[24]
H. Chu, D. Li, D. Acuna, A. Kar, M. Shugrina, X. Wei, M. Y. Liu, A. Torralba, and S. Fidler, Neural turtle graphics for modeling city road layouts, in Proc. IEEE/CVF Int. Conf. Computer Vision (ICCV), Seoul, Republic of Korea, 2019, pp. 4521–4529.
[25]
Y. Jin and J. F. JáJá, Learning graph-level representations with gated recurrent neural networks, arXiv preprint arXiv: 1805.07683, 2018.
[26]
Y. Li, D. Tarlow, M. Brockschmidt, and R. S. Zemel, Gated graph sequence neural networks, in Proc. 4th Int. Conf. Learning Representations (ICLR), San Juan, Puerto Rico, 2016.
[28]
V. N. Ioannidis, A. G. Marques, and G. B. Giannakis, A recurrent graph neural network for multi-relational data, in Proc. 2019 IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 2019, pp. 8157–8161.
[29]
P. Goyal, N. Kamra, X. He, and Y. Liu, DynGEM: Deep embedding method for dynamic graphs, arXiv preprint arXiv: 1805.11273, 2018.
[31]
E. Hajiramezanali, A. Hasanzadeh, N. Duffield, K. R. Narayanan, M. Zhou, and X. Qian, Variational graph recurrent neural networks, arXiv preprint arXiv: 1908.09710, 2019.
[32]
T. Yan, H. Zhang, Z. Li, and Y. Xia, Stochastic graph recurrent neural network, arXiv preprint arXiv: 2009.00538, 2020.
[33]
R. T. Q. Chen, Y. Rubanova, J. Bettencourt, and D. Duvenaud, Neural ordinary differential equations, arXiv preprint arXiv: 1806.07366, 2018.
[34]
Y. Rubanova, T. Q. Chen, and D. K. Duvenaud, Latent ordinary differential equations for irregularly-sampled time series, in Proc. 33rd Conf. Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 2019, pp. 5321–5331.
[35]
E. De Brouwer, J. Simm, A. Arany, and Y. Moreau, GRU-ODE-Bayes: Continuous modeling of sporadically-observed time series, in Proc. 33rd Conf. Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 2019, pp. 7379–7390.
[36]
Z. Fang, Q. Long, G. Song, and K. Xie, Spatial-temporal graph ODE networks for traffic flow forecasting, in Proc. 27th ACM SIGKDD Conf. Knowledge Discovery & Data Mining, virtual, 2021.
[37]
J. Choi, H. Choi, J. Hwang, and N. Park, Graph neural controlled differential equations for traffic forecasting, in Proc. 36th AAAI Conf. Artificial Intelligence (AAAI-22), virtual, 2022, pp. 6367–6374.
[38]
X. Liu, T. Xiao, S. Si, Q. Cao, S. Kumar, and C. J. Hsieh, How does noise help robustness? Explanation and exploration under the neural SDE framework, in Proc. IEEE/CVF Conf. Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 2020, pp. 279–287.
[39]
S. Peluchetti and S. Favaro, Infinitely deep neural networks as diffusion processes, in Proc. 23rd Int. Conf. Artificial Intelligence and Statistics, virtual, 2020, pp. 1126–1136.
[40]
L. Kong, J. Sun, and C. Zhang, SDE-Net: Equipping deep neural network with uncertainty estimates, in Proc. 37th Int. Conf. Machine Learning, virtual, 2020.
[41]
B. Tzen and M. Raginsky, Theoretical guarantees for sampling and inference in generative models with latent diffusions, in Proc. 32nd Annual Conf. Learning Theory, Phoenix, AZ, USA, 2019, pp. 3084–3114.
[42]
B. Tzen and M. Raginsky, Neural stochastic differential equations: Deep latent Gaussian models in the diffusion limit, arXiv preprint arXiv: 1905.09883, 2019.
[43]
X. Li, T. K. L. Wong, R. T. Q. Chen, and D. Duvenaud, Scalable gradients for stochastic differential equations, in Proc. 23rd Int. Conf. Artificial Intelligence and Statistics, virtual, 2020, pp. 3870–3882.
[44]
X. Liu, T. Xiao, S. Si, Q. Cao, S. Kumar, and C. J. Hsieh, Neural SDE: Stabilizing neural ODE networks with stochastic noise, arXiv preprint arXiv: 1906.02355, 2019.
[45]
Y. Liu, Y. Xing, X. Yang, X. Wang, J. Shi, D. Jin, and Z. Chen, Learning continuous-time dynamics by stochastic differential networks, arXiv preprint arXiv: 2006.06145, 2020.
[46]
Y. Liu, Y. Xing, X. Yang, X. Wang, J. Shi, D. Jin, Z. Chen, and J. Wu, Continuous-time stochastic differential networks for irregular time series modeling, in Proc. 28th Int. Conf. Neural Information Processing, Bali, Indonesia, 2021, pp. 343–351.
[47]
Y. Liu, Deep generative models for stochastic modeling of multivariate sequential data, Ph.D. dissertation, State University of New York at Stony Brook, USA, 2021.
[49]
D. P. Kingma and J. Ba, Adam: A method for stochastic optimization, arXiv preprint arXiv: 1412.6980, 2014.