[1]
Y. LeCun, Y. Bengio, and G. Hinton, Deep learning, Nature, vol. 521, no. 7553, pp. 436-444, 2015.
[2]
T. Mikolov, K. Chen, G. Corrado, and J. Dean, Efficient estimation of word representations in vector space, in Proc. of the International Conference on Learning Representations, Scottsdale, AZ, USA, 2013.
[3]
J. Atwood and D. Towsley, Diffusion-convolutional neural networks, in Proc. of the 30th International Conference on Neural Information Processing Systems, Barcelona, Spain, 2016, pp. 2001-2009.
[4]
T. N. Kipf and M. Welling, Semi-supervised classification with graph convolutional networks, presented at the 5th International Conference on Learning Representations, Toulon, France, 2017.
[5]
Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and P. S. Yu, A comprehensive survey on graph neural networks, IEEE Transactions on Neural Networks and Learning Systems, vol. 32, no. 1, pp. 4-24, 2021.
[6]
J. Lee, I. Lee, and J. Kang, Self-attention graph pooling, in Proc. of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 2019, pp. 3734-3743.
[8]
K. He, X. Zhang, S. Ren, and J. Sun, Deep residual learning for image recognition, in Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 2016, pp. 770-778.
[9]
W. L. Hamilton, R. Ying, and J. Leskovec, Inductive representation learning on large graphs, in Proc. of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 2017, pp. 1025-1035.
[10]
P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio, Graph attention networks, presented at the 6th International Conference on Learning Representations, Vancouver, Canada, 2018.
[11]
G. Li, M. Müller, A. K. Thabet, and B. Ghanem, DeepGCNs: Can GCNs go as deep as CNNs? in Proc. of the 2019 IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 2019, pp. 9266-9275.
[12]
Y. Rong, W. Huang, T. Xu, and J. Huang, DropEdge: Towards deep graph convolutional networks on node classification, presented at the 8th International Conference on Learning Representations, Addis Ababa, Ethiopia, 2020.
[13]
R. Ying, J. You, C. Morris, X. Ren, W. L. Hamilton, and J. Leskovec, Hierarchical graph representation learning with differentiable pooling, in Proc. of the 32nd International Conference on Neural Information Processing Systems, Montréal, Canada, 2018, pp. 4805-4815.
[14]
H. Gao and S. Ji, Graph U-nets, in Proc. of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 2019, pp. 2083-2092.
[15]
E. Ranjan, S. Sanyal, and P. Talukdar, ASAP: Adaptive structure aware pooling for learning hierarchical graph representations, in Proc. of the 34th AAAI Conference on Artificial Intelligence, New York, NY, USA, 2020, pp. 5470-5477.
[16]
D. Chen, Y. Lin, W. Li, P. Li, J. Zhou, and X. Sun, Measuring and relieving the over-smoothing problem for graph neural networks from the topological view, in Proc. of the 34th AAAI Conference on Artificial Intelligence, New York, NY, USA, 2020, pp. 3438-3445.
[17]
M. Fey and J. E. Lenssen, Fast graph representation learning with PyTorch Geometric, presented at the 7th International Conference on Learning Representations, New Orleans, LA, USA, 2019.
[18]
P. D. Dobson and A. J. Doig, Distinguishing enzyme structures from non-enzymes without alignments, Journal of Molecular Biology, vol. 330, no. 4, pp. 771-783, 2003.
[19]
K. M. Borgwardt, C. S. Ong, S. Schönauer, S. V. N. Vishwanathan, A. J. Smola, and H. P. Kriegel, Protein function prediction via graph kernels, Bioinformatics, vol. 21, no. S1, pp. i47-i56, 2005.
[20]
N. Shervashidze, P. Schweitzer, E. J. van Leeuwen, K. Mehlhorn, and K. M. Borgwardt, Weisfeiler-Lehman graph kernels, The Journal of Machine Learning Research, vol. 12, no. 3, pp. 2539-2561, 2011.
[21]
N. Wale, I. A. Watson, and G. Karypis, Comparison of descriptor spaces for chemical compound retrieval and classification, Knowledge and Information Systems, vol. 14, no. 3, pp. 347-375, 2008.
[22]
O. Vinyals, S. Bengio, and M. Kudlur, Order matters: Sequence to sequence for sets, presented at the 4th International Conference on Learning Representations, San Juan, Puerto Rico, 2016.
[23]
M. Zhang, Z. Cui, M. Neumann, and Y. Chen, An end-to-end deep learning architecture for graph classification, in Proc. of the 32nd AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2018, pp. 1127-1137.
[24]
Z. Ma, J. Xuan, Y. G. Wang, M. Li, and P. Liò, Path integral based convolution and pooling for graph neural networks, presented at the 34th Conference on Neural Information Processing Systems, Vancouver, Canada, 2020.
[25]
A. Micheli, Neural network for graphs: A contextual constructive approach, IEEE Transactions on Neural Networks, vol. 20, no. 3, pp. 498-511, 2009.
[26]
M. Chen, Z. Wei, Z. Huang, B. Ding, and Y. Li, Simple and deep graph convolutional networks, in Proc. of the 37th International Conference on Machine Learning, Virtual Event, 2020, pp. 1725-1735.
[27]
E. Chien, J. Peng, P. Li, and O. Milenkovic, Adaptive universal generalized PageRank graph neural network, presented at the 9th International Conference on Learning Representations, Virtual Event, 2021.
[28]
K. Xu, W. Hu, J. Leskovec, and S. Jegelka, How powerful are graph neural networks? presented at the 7th International Conference on Learning Representations, New Orleans, LA, USA, 2019.