References
[1] Löfström T., On effectively creating ensembles of classifiers: Studies on creation strategies, diversity and predicting with confidence, PhD dissertation, Dept. Comput. Syst. Sci., Stockholm University, Stockholm, Sweden, 2015.
[2] Perez-Diaz N., Ruano-Ordas D., Fdez-Riverola F., and Mendez J. R., Boosting accuracy of classical machine learning antispam classifiers in real scenarios by applying rough set theory, Sci. Program.
[3] Breiman L., Bagging predictors, Mach. Learn., vol. 24, no. 2, pp. 123-140, 1996.
[4] Freund Y., Schapire R., and Abe N., A short introduction to boosting, J. Jpn. Soc. Artif. Intell., vol. 14, no. 5, pp. 771-780, 1999.
[5] Freund Y. and Schapire R. E., Experiments with a new boosting algorithm, in Proc. 13th Int. Conf. Machine Learning, Bari, Italy, 1996, pp. 1-9.
[6] Luo X., Liu J., Zhang D. D., and Chang X. H., A large-scale web QoS prediction scheme for the industrial Internet of Things based on a kernel machine learning algorithm, Comput. Networks, vol. 101, pp. 81-89, 2016.
[7] Luo X., Deng J., Liu J., Wang W. P., Ban X. J., and Wang J. H., A quantized kernel least mean square scheme with entropy-guided learning for intelligent data analysis, China Commun., vol. 14, no. 7, pp. 127-136, 2017.
[8] Miche Y., Sorjamaa A., Bas P., Simula O., Jutten C., and Lendasse A., OP-ELM: Optimally pruned extreme learning machine, IEEE Trans. Neural Netw., vol. 21, no. 1, pp. 158-162, 2010.
[9] Huang G. B., Zhou H., Ding X., and Zhang R., Extreme learning machine for regression and multiclass classification, IEEE Trans. Syst. Man Cybern. Part B Cybern., vol. 42, no. 2, pp. 513-529, 2012.
[10] Atiquzzaman M. and Kandasamy J., Prediction of hydrological time-series using extreme learning machine, J. Hydroinform., vol. 18, no. 2, pp. 345-353, 2016.
[11] Xu Y., Luo X., Wang W. P., and Zhao W. B., Efficient DV-HOP localization for wireless cyber-physical social sensing system: A correntropy-based neural network learning scheme, Sensors, vol. 17, no. 1, p. 135, 2017.
[12] Luo X., Zhang D. D., Yang L. T., Liu J., Chang X. H., and Ning H. S., A kernel machine-based secure data sensing and fusion scheme in wireless sensor networks for the cyber-physical systems, Future Gener. Comput. Syst., vol. 61, pp. 85-96, 2016.
[13] Deng W., Zheng Q., and Chen L., Regularized extreme learning machine, in Proc. IEEE Symp. Comput. Intell. Data Min., Nashville, TN, USA, 2009, pp. 389-395.
[14] Huang G. B., Ding X., and Zhou H., Optimization method based extreme learning machine for classification, Neurocomputing, vol. 74, nos. 1-3, pp. 155-163, 2010.
[15] Luo X. and Chang X. H., A novel data fusion scheme using grey model and extreme learning machine in wireless sensor networks, Int. J. Control Autom. Syst., vol. 13, no. 3, pp. 539-546, 2015.
[16] Luo X., Chang X. H., and Ban X. J., Regression and classification using extreme learning machine based on L1-norm and L2-norm, Neurocomputing, vol. 174, pp. 179-186, 2016.
[17] Yu H., Yuan Y., Yang X., and Dan Y., A dynamic generation approach for ensemble of extreme learning machines, Lect. Notes Comput. Sci., vol. 8866, pp. 294-302, 2014.
[18] Liu N. and Wang H., Ensemble based extreme learning machine, IEEE Signal Process. Lett., vol. 17, no. 8, pp. 754-757, 2010.
[19] Samat A., Du P., Liu S., Li J., and Cheng L., E2LMs: Ensemble extreme learning machines for hyperspectral image classification, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., vol. 7, no. 4, pp. 1060-1069, 2014.
[20] Wang H., He Q., Shang T., Zhuang F., and Shi Z., Extreme learning machine ensemble classifier for large-scale data, in Proc. ELM, Singapore, 2014, pp. 151-161.
[21] Huang S., Wang B., Qiu J., Yao J., Wang G., and Yu G., Parallel ensemble of online sequential extreme learning machine based on MapReduce, Neurocomputing, vol. 174, pp. 352-367, 2016.
[22] Cao J., Chen T., and Fan J., Landmark recognition with compact BoW histogram and ensemble ELM, Multimed. Tools Appl., vol. 75, no. 5, pp. 2839-2857, 2016.
[23] Jin Y., Cao J., Wang Y., and Zhi R., Ensemble based extreme learning machine for cross-modality face matching, Multimed. Tools Appl., vol. 75, no. 19, pp. 1-16, 2016.
[24] Wang X. L., Chen Y. Y., Zhao H., and Lu B. L., Parallelized extreme learning machine ensemble based on min-max modular network, Neurocomputing, vol. 128, no. 5, pp. 31-41, 2014.
[25] Abuassba A. O. M., Zhang D., Luo X., Shaheryar A., and Ali H., Improving classification performance through an advanced ensemble based heterogeneous extreme learning machines, Comput. Intell. Neurosci.
[26] Lu H., Zhang J. W., Ma X., and Zheng W., Tumor classification using extreme learning machine ensemble, (in Chinese), Math. Pract. Theory, vol. 42, no. 17, pp. 148-154, 2012.
[27] Liu Y. and Yao X., Ensemble learning via negative correlation, Neural Netw., vol. 12, no. 10, pp. 1399-1404, 1999.
[28] Xing H. J. and Wang X. M., Training extreme learning machine via regularized correntropy criterion, Neural Comput. Appl., vol. 23, nos. 7-8, pp. 1977-1986, 2013.
[29] Li K., Kong X., Lu Z., Liu W., and Yin J., Boosting weighted ELM for imbalanced learning, Neurocomputing, vol. 128, pp. 15-21, 2014.
[30] Jiang Y., Shen Y., Liu Y., and Liu W., Multiclass AdaBoost ELM and its application in LBP based face recognition, Math. Probl. Eng.
[31] Zhang Y., Wu J., Cai Z., Zhang P., and Chen L., Memetic extreme learning machine, Pattern Recogn., vol. 58, pp. 135-148, 2016.
[32] Santamaria I., Pokharel P. P., and Principe J. C., Generalized correlation function: Definition, properties and application to blind equalization, IEEE Trans. Signal Process., vol. 54, no. 6, pp. 2187-2197, 2006.
[33] Liu W., Pokharel P. P., and Principe J. C., Correntropy: Properties and applications in non-Gaussian signal processing, IEEE Trans. Signal Process., vol. 55, no. 11, pp. 5286-5298, 2007.
[34] Vapnik V., The Nature of Statistical Learning Theory. Berlin, Germany: Springer, 1995.
[35] Luo X., Deng J., Wang W. P., Wang J. H., and Zhao W. B., A quantized kernel learning algorithm using a minimum kernel risk-sensitive loss criterion and bilateral gradient technique, Entropy, vol. 19, no. 7, p. 365, 2017.
[36] Wang G. and Li P., Dynamical AdaBoost ensemble extreme learning machine, in Proc. 3rd Int. Conf. Adv. Comput. Theory Eng., Chengdu, China, 2010, pp. V354-V358.
[37] Sachnev V., Ramasamy S., Sundaram S., Kim H. J., and Hwang H. J., A cognitive ensemble of extreme learning machines for steganalysis based on risk-sensitive hinge loss function, Cognitive Comput., vol. 7, no. 1, pp. 103-110, 2014.