Journal Home > Volume 28, Issue 2




CNN-Based Broad Learning for Cross-Domain Emotion Classification

Rong Zeng¹, Hongzhan Liu¹( ), Sancheng Peng²( ), Lihong Cao², Aimin Yang³, Chengqing Zong⁴, Guodong Zhou⁵
1. Guangdong Provincial Key Laboratory of Nanophotonic Functional Materials and Devices, South China Normal University, Guangzhou 511400, China
2. Laboratory of Language Engineering and Computing, Guangdong University of Foreign Studies, Guangzhou 510006, China
3. School of Computer Science and Intelligence Education, Lingnan Normal University, Guangzhou 510006, China
4. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
5. School of Computer Science and Technology, Soochow University, Suzhou 215031, China

Abstract

Cross-domain emotion classification aims to leverage useful information in a source domain to help predict emotion polarity in a target domain in an unsupervised or semi-supervised manner. Due to the domain discrepancy, an emotion classifier trained on the source domain may not work well on the target domain. Many researchers have focused on traditional cross-domain sentiment classification, which is coarse-grained emotion classification; the fine-grained problem of cross-domain emotion classification, however, has rarely been addressed. In this paper, we propose a method, called convolutional neural network (CNN) based broad learning, for cross-domain emotion classification, which combines the strengths of CNN and broad learning. We first utilize a CNN to extract domain-invariant and domain-specific features simultaneously, and then employ broad learning to train two more efficient classifiers on these features. To take advantage of both classifiers, we design a co-training model that boosts them jointly. Finally, we conduct comparative experiments on four datasets to verify the effectiveness of the proposed method. The experimental results show that it improves the performance of emotion classification more effectively than the baseline methods.
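The pipeline summarized above, two classifiers trained on different feature views and combined by co-training, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the synthetic two-view data, the nearest-centroid classifiers, and the margin-based confidence heuristic are all hypothetical stand-ins for the CNN feature extractors and broad-learning classifiers described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, shift=0.0):
    """Synthetic stand-in data with two feature views per example:
    view A plays the role of domain-invariant features, view B the
    role of domain-specific features (shift mimics domain discrepancy)."""
    y = rng.integers(0, 2, n)
    a = rng.normal(loc=2.0 * y[:, None], scale=1.0, size=(n, 2))
    b = rng.normal(loc=2.0 * y[:, None] + shift, scale=1.0, size=(n, 2))
    return a, b, y

class CentroidClassifier:
    """Nearest-class-centroid classifier, a simple stand-in for a
    broad-learning classifier."""
    def fit(self, X, y):
        self.c = np.stack([X[y == 0].mean(0), X[y == 1].mean(0)])
        return self
    def score(self, X):
        # Signed margin: positive means closer to the class-1 centroid.
        d = np.linalg.norm(X[:, None, :] - self.c[None, :, :], axis=2)
        return d[:, 0] - d[:, 1]
    def predict(self, X):
        return (self.score(X) > 0).astype(int)

def co_train(a_lab, b_lab, y_lab, a_un, b_un, rounds=3, k=10):
    """Each round, each view's classifier pseudo-labels the k unlabeled
    examples it is most confident about; the pseudo-labeled examples
    grow the shared training set used by both classifiers."""
    for _ in range(rounds):
        clf_a = CentroidClassifier().fit(a_lab, y_lab)
        clf_b = CentroidClassifier().fit(b_lab, y_lab)
        for i, clf in enumerate((clf_a, clf_b)):
            view = (a_un, b_un)[i]          # current unlabeled pool
            if len(view) < k:
                break
            pick = np.argsort(-np.abs(clf.score(view)))[:k]
            pseudo = clf.predict(view[pick])
            a_lab = np.vstack([a_lab, a_un[pick]])
            b_lab = np.vstack([b_lab, b_un[pick]])
            y_lab = np.concatenate([y_lab, pseudo])
            keep = np.ones(len(a_un), dtype=bool)
            keep[pick] = False
            a_un, b_un = a_un[keep], b_un[keep]
    return clf_a, clf_b

# Labeled source domain, unlabeled target domain (shift = discrepancy).
a_s, b_s, y_s = make_data(100)
a_t, b_t, y_t = make_data(200, shift=0.5)
clf_a, clf_b = co_train(a_s, b_s, y_s, a_t, b_t)
# The final prediction averages the two views' margins.
pred = ((clf_a.score(a_t) + clf_b.score(b_t)) > 0).astype(int)
acc = (pred == y_t).mean()
```

The key design point mirrored here is that the two classifiers see complementary views of the same examples, so confident pseudo-labels from one view supply new training signal to the other.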

Keywords: CNN, broad learning, cross-domain emotion classification, classifier, co-training


Publication history

Received: 22 December 2021
Revised: 15 March 2022
Accepted: 17 March 2022
Published: 29 September 2022
Issue date: April 2023

Copyright

© The author(s) 2023.

Acknowledgements

This work was partially supported by the National Natural Science Foundation of China (No. 61876205), the Natural Science Foundation of Guangdong (No. 2021A1515012652), and the Science and Technology Program of Guangzhou (No. 2019050001).

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
