
Attention-Aware Heterogeneous Graph Neural Network

Authors: Jintao Zhang, Quan Xu
College of Sciences, Northeastern University, Shenyang 110004, China
State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Shenyang 110819, China

Abstract

Graph Neural Networks (GNNs), a family of powerful tools for learning embedding representations of graph-structured data, were originally built on homogeneous networks and have been widely used in various data mining tasks. Applying a GNN to embed a Heterogeneous Information Network (HIN), however, remains a major challenge, mainly because an HIN contains many different types of nodes and many different types of relationships between nodes. An HIN therefore carries rich semantic and structural information, which calls for a specially designed graph neural network. However, existing HIN-based graph neural network models rarely consider the interactive information hidden among the meta-paths of an HIN, which results in poor node embeddings. In this paper, we propose an Attention-aware Heterogeneous graph Neural Network (AHNN) model to effectively extract useful information from an HIN and use it to learn the embedding representations of nodes. Specifically, we first use node-level attention to aggregate and update the embedding representation of each node, and then concatenate the embedding representations obtained on different meta-paths. Finally, a semantic-level neural network is proposed to extract the feature interactions among different meta-paths and learn the final node embeddings. Experimental results on three widely used datasets show that the AHNN model significantly outperforms state-of-the-art models.
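
The abstract outlines a three-step pipeline: node-level attention within each meta-path, concatenation of the per-meta-path embeddings, and a semantic-level network over the concatenated features. The sketch below illustrates that pipeline only as read from the abstract; the class name AHNNSketch, the use of PyTorch Geometric's GATConv for node-level attention, the MLP used as the semantic-level network, and all layer sizes are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv  # assumed choice for node-level attention


class AHNNSketch(nn.Module):
    """Minimal sketch of the three steps described in the abstract."""

    def __init__(self, in_dim, hid_dim, out_dim, num_metapaths, heads=4):
        super().__init__()
        # Step 1: one node-level attention layer per meta-path-induced graph.
        self.node_attention = nn.ModuleList(
            [GATConv(in_dim, hid_dim, heads=heads, concat=False)
             for _ in range(num_metapaths)]
        )
        # Step 3: a semantic-level network over the concatenated meta-path
        # embeddings; a plain MLP is assumed here as the interaction model.
        self.semantic = nn.Sequential(
            nn.Linear(num_metapaths * hid_dim, hid_dim),
            nn.ReLU(),
            nn.Linear(hid_dim, out_dim),
        )

    def forward(self, x, edge_indices):
        # x: [num_nodes, in_dim]; edge_indices: one edge_index per meta-path.
        per_path = [att(x, ei) for att, ei in zip(self.node_attention, edge_indices)]
        z = torch.cat(per_path, dim=-1)  # Step 2: concatenate per-meta-path embeddings
        return self.semantic(z)          # Step 3: learn cross-meta-path interactions
```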

Keywords: Graph Neural Network (GNN), Heterogeneous Information Network (HIN), embedding


Publication history

Received: 21 February 2021
Revised: 25 April 2021
Accepted: 28 April 2021
Published: 26 August 2021
Issue date: December 2021

Copyright

© The author(s) 2021

Acknowledgements

This work was supported by the Key Scientific Guiding Project for the Central Universities Research Funds (No. N2008005), the Major Science and Technology Project of Liaoning Province of China (No. 2020JH1/10100008), and the National Key Research and Development Program of China (No. 2018YFB1701104).

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
