Structure Optimization for Echo State Network Based on Contribution

Dingyuan Li, Fu Liu, Junfei Qiao, and Rong Li
College of Communication Engineering, Jilin University, Changchun 130022, China.
Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China.
Department of Information Technology, Beijing Vocational College of Agriculture, Beijing 100124, China.

Abstract

Echo State Network (ESN) is a recurrent neural network with a large, randomly generated recurrent part called the dynamic reservoir; only the output weights are modified during training. However, properly balancing the trade-off between network structure and performance in an ESN remains a difficult task. In this paper, a contribution-based structure optimization method for ESN is proposed to simplify the network structure and improve its performance. First, we evaluate the contribution of each reservoir neuron. Second, we apply a pruning mechanism that removes the connection weights of reservoir neurons with low contribution. Finally, we learn the new output weights with the pseudo-inverse method. The optimized network, named C-ESN, is tested on a Lorenz chaotic time-series prediction task and on data from an actual municipal sewage treatment system. The simulation results show that C-ESN achieves better prediction and generalization performance than the standard ESN.

Keywords: neural network, structural design, time-series prediction
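
The abstract's three-step pipeline (score reservoir neurons by their contribution, prune the connections tied to low-contribution neurons, then relearn the output weights with the pseudo-inverse) can be sketched in a few lines of NumPy. The sketch below is not the authors' implementation: the contribution score (activation variance), the 30% pruning quantile, the 0.9 spectral radius, and the sine-wave task are illustrative assumptions, and for simplicity it drops low-contribution neurons from the readout rather than pruning individual reservoir connections.

```python
# Minimal C-ESN-style sketch: random reservoir, contribution-based pruning,
# pseudo-inverse readout. The contribution score used here (activation
# variance) is an illustrative stand-in, not the measure defined in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Build a random reservoir and scale its spectral radius to 0.9.
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)                      # shape (T, n_res)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 60, 0.05)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
washout = 100
X_tr, y_tr = X[washout:], y[washout:]

# Step 1: score each reservoir neuron's contribution (variance stand-in).
contribution = X_tr.var(axis=0)

# Step 2: prune the lowest-contribution neurons (drop the bottom 30%).
keep = contribution >= np.quantile(contribution, 0.3)
X_pruned = X_tr[:, keep]

# Step 3: relearn the output weights with the pseudo-inverse.
W_out = np.linalg.pinv(X_pruned) @ y_tr

y_hat = X_pruned @ W_out
print("training MSE after pruning:", np.mean((y_hat - y_tr) ** 2))
```

In the paper, the authors' contribution measure and the Lorenz and sewage-treatment data sets would take the place of the toy pieces used here.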


Publication history

Received: 03 January 2018
Accepted: 10 January 2018
Published: 08 November 2018
Issue date: February 2019

Copyright

© The author(s) 2019

Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61225016) and the Key Project of National Natural Science Foundation of China (No. 61533002).
