Complex Network Classification with Convolutional Neural Network

Ruyue Xin, Jiang Zhang (corresponding author), and Yitong Shao
School of Systems Science, Beijing Normal University, Beijing 100875, China.
School of Mathematical Sciences, Beijing Normal University, Beijing 100875, China.

Abstract

Classifying large-scale networks into several categories and distinguishing them according to their fine structures is of great importance to many real-life applications. However, most studies of complex networks focus on the properties of a single network and seldom address the classification, clustering, and comparison of different networks, tasks in which each network is treated as a whole. Conventional methods can hardly be applied to networks directly because of the non-Euclidean nature of network data. In this paper, we propose a novel framework, the Complex Network Classifier (CNC), which integrates network embedding and a convolutional neural network to tackle the network classification problem. By training the classifier on synthetic complex network data, we show that CNC not only classifies networks with high accuracy and robustness but also extracts network features automatically. We also compare CNC with baseline methods on benchmark datasets, and the results show that our method performs well on large-scale networks.

Keywords: complex network, Convolutional Neural Network (CNN), DeepWalk, network classification
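
To make the pipeline described above concrete, the sketch below shows one way such a classifier could be assembled: synthetic networks are embedded with a DeepWalk-style random-walk plus skip-gram procedure, each network's two-dimensional node embedding is rasterized into a fixed-size, image-like grid, and a small CNN classifies the resulting grids. This is a minimal illustration only, not the authors' implementation: the libraries used (networkx, gensim, PyTorch), the 32 x 32 grid, the hyper-parameters, and the choice of Barabási-Albert versus Watts-Strogatz networks as the two synthetic classes are all assumptions made here for the example.

import random

import networkx as nx
import numpy as np
import torch
import torch.nn as nn
from gensim.models import Word2Vec  # gensim >= 4.0 assumed (vector_size argument)

GRID = 32  # side length of the image-like representation (illustrative choice)


def random_walks(g, num_walks=10, walk_len=40):
    # Uniform truncated random walks over the graph, as in DeepWalk.
    walks, nodes = [], list(g.nodes())
    for _ in range(num_walks):
        random.shuffle(nodes)
        for start in nodes:
            walk, cur = [start], start
            for _ in range(walk_len - 1):
                nbrs = list(g.neighbors(cur))
                if not nbrs:
                    break
                cur = random.choice(nbrs)
                walk.append(cur)
            walks.append([str(n) for n in walk])
    return walks


def graph_to_image(g):
    # Skip-gram embedding of the walks into 2-D, then a GRID x GRID histogram.
    model = Word2Vec(random_walks(g), vector_size=2, window=5,
                     min_count=0, sg=1, epochs=3, workers=1)
    emb = np.array([model.wv[str(n)] for n in g.nodes()])
    emb = (emb - emb.min(0)) / (emb.max(0) - emb.min(0) + 1e-9)  # scale to [0, 1]
    img, _, _ = np.histogram2d(emb[:, 0], emb[:, 1],
                               bins=GRID, range=[[0, 1], [0, 1]])
    return (img / (img.max() + 1e-9)).astype(np.float32)


class SmallCNN(nn.Module):
    # A tiny CNN classifier over the image-like network representations.
    def __init__(self, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(16 * (GRID // 4) ** 2, num_classes))

    def forward(self, x):
        return self.net(x)


# Synthetic training data: scale-free (BA) versus small-world (WS) networks,
# chosen here only to illustrate a two-class setting.
graphs = [(nx.barabasi_albert_graph(200, 3), 0) for _ in range(30)] + \
         [(nx.watts_strogatz_graph(200, 6, 0.1), 1) for _ in range(30)]
x = torch.tensor(np.stack([graph_to_image(g) for g, _ in graphs]))[:, None]
y = torch.tensor([label for _, label in graphs])

model = SmallCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(20):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
print("training accuracy:", (model(x).argmax(1) == y).float().mean().item())

Rasterizing the two-dimensional embedding into a histogram is simply one convenient way to give a standard 2-D convolutional network a fixed-size input; any other mapping from an embedding to a grid would slot into graph_to_image without changing the rest of the sketch.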


Publication history

Received: 18 January 2019
Revised: 12 July 2019
Accepted: 09 September 2019
Published: 13 January 2020
Issue date: August 2020

Copyright

© The author(s) 2020

Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61673070) and the Beijing Normal University Interdisciplinary Project. The authors would like to thank the referees for their valuable comments and helpful suggestions.

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
