
Graph Neural Architecture Search: A Survey

Babatounde Moctard Oloulade, Jianliang Gao, Jiamin Chen, Tengfei Lyu, and Raeed Al-Sabri
School of Computer Science and Engineering, Central South University, Changsha 410083, China

Abstract

In academia and industry, graph neural networks (GNNs) have emerged as a powerful approach to processing graph data, for tasks ranging from node classification and link prediction to graph clustering. GNN models are usually handcrafted, yet building handcrafted GNN models is difficult and requires expert experience because GNN model components are complex and sensitive to variations. The complexity of GNN model components poses significant challenges to the efficiency of existing GNNs. Hence, many studies have focused on building automated machine learning frameworks that search for the best GNN model for a target task. In this work, we provide a comprehensive review of automatic GNN model building frameworks to summarize the status of the field and facilitate future progress. We categorize the components of automatic GNN model building frameworks into three dimensions according to the challenges of building them. After reviewing the representative works for each dimension, we discuss promising future research directions in this rapidly growing field.

Keywords: automated machine learning, graph neural network, neural architecture search, geometric deep learning
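
To make the idea of automatic GNN model building concrete, the following minimal sketch (not taken from any of the surveyed frameworks) runs a random search over a toy GNN design space. It assumes PyTorch and PyTorch Geometric are installed and a Planetoid-style node classification dataset (e.g., Cora); the search space, the SampledGNN and random_search names, and the train_and_evaluate routine are illustrative assumptions rather than a particular framework's API.

```python
# Illustrative sketch only: random search over a tiny GNN design space.
# Assumes PyTorch + PyTorch Geometric; names and the search space are hypothetical.
import random
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, GATConv, SAGEConv

# A toy search space over common GNN design choices.
SEARCH_SPACE = {
    "conv": [GCNConv, GATConv, SAGEConv],        # message-passing / aggregation operator
    "hidden_dim": [16, 64, 128],                 # hidden embedding size
    "num_layers": [2, 3],                        # network depth
    "activation": [F.relu, F.elu, torch.tanh],   # non-linearity
}

class SampledGNN(torch.nn.Module):
    """A GNN instantiated from one sampled architecture description."""
    def __init__(self, arch, in_dim, out_dim):
        super().__init__()
        dims = [in_dim] + [arch["hidden_dim"]] * (arch["num_layers"] - 1) + [out_dim]
        self.convs = torch.nn.ModuleList(
            arch["conv"](dims[i], dims[i + 1]) for i in range(arch["num_layers"])
        )
        self.act = arch["activation"]

    def forward(self, x, edge_index):
        for conv in self.convs[:-1]:
            x = self.act(conv(x, edge_index))
        return self.convs[-1](x, edge_index)

def train_and_evaluate(model, data, epochs=100):
    # Simple training/validation routine for node classification, assuming a
    # Planetoid-style `data` object with x, edge_index, y, train_mask, val_mask.
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
    for _ in range(epochs):
        model.train()
        optimizer.zero_grad()
        out = model(data.x, data.edge_index)
        loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
        loss.backward()
        optimizer.step()
    model.eval()
    with torch.no_grad():
        pred = model(data.x, data.edge_index).argmax(dim=-1)
        correct = (pred[data.val_mask] == data.y[data.val_mask]).sum()
    return float(correct) / int(data.val_mask.sum())   # validation accuracy

def random_search(data, in_dim, out_dim, budget=20):
    # Sample `budget` candidate architectures and keep the best-performing one.
    best_arch, best_score = None, -1.0
    for _ in range(budget):
        arch = {key: random.choice(choices) for key, choices in SEARCH_SPACE.items()}
        model = SampledGNN(arch, in_dim, out_dim)
        score = train_and_evaluate(model, data)        # performance estimation
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

# Example usage (assuming the Cora citation network from torch_geometric.datasets):
#   from torch_geometric.datasets import Planetoid
#   data = Planetoid(root="/tmp/Cora", name="Cora")[0]
#   best_arch, best_acc = random_search(data, in_dim=data.num_node_features,
#                                       out_dim=7, budget=20)
```

The frameworks reviewed in this survey replace the random sampling above with more sophisticated search strategies, such as reinforcement learning, evolutionary search, or differentiable relaxation, and pair them with cheaper performance estimation than fully training every candidate.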


Publication history

Received: 15 May 2021
Revised: 15 July 2021
Accepted: 30 July 2021
Published: 09 December 2021
Issue date: August 2022

Copyright

© The author(s) 2022

Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61873288) and the CAAI-Huawei MindSpore Open Fund.

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
