
Multi-Attention Fusion Modeling for Sentiment Analysis of Educational Big Data

Guanlin Zhai, Yan Yang (corresponding author), Heng Wang, and Shengdong Du
School of Information Science and Technology, National Engineering Laboratory of Integrated Transportation Big Data Application Technology, Southwest Jiaotong University, Chengdu 611756, China

Abstract

Sentiment analysis, an important branch of natural language processing, has received increasing attention. In teaching evaluation, it can help educators discover students' true feelings about a course in a timely manner and adjust the teaching plan accurately and promptly, thereby improving the quality of education and teaching. To address the inefficiency and heavy workload of existing college curriculum evaluation methods, a Multi-Attention Fusion Modeling (Multi-AFM) approach is proposed, which integrates global attention and local attention under the control of a gating unit to generate a reasonable contextual representation and achieve improved classification results. Experimental results show that Multi-AFM outperforms existing methods in education and other application domains.
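
The mechanism summarized above, fusing a global (whole-sentence) attention representation with a local (aspect-neighborhood) attention representation through a gating unit, can be sketched in a few lines of code. The PyTorch snippet below is a minimal illustration of that idea and is not the authors' implementation: the bilinear attention scoring, the mask-based local window around the aspect term, and all class and parameter names are assumptions for illustration only.

    # Minimal sketch (assumed, not the authors' code) of gated fusion of
    # global and local attention for aspect-level sentiment classification.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GatedAttentionFusion(nn.Module):
        def __init__(self, hidden_dim, num_classes=3):
            super().__init__()
            self.score = nn.Bilinear(hidden_dim, hidden_dim, 1)  # attention scoring (assumed bilinear)
            self.gate = nn.Linear(2 * hidden_dim, hidden_dim)    # gating unit controlling the fusion
            self.classifier = nn.Linear(hidden_dim, num_classes)

        def attend(self, H, aspect, mask):
            # H: [B, T, d] context states (e.g., from an LSTM encoder),
            # aspect: [B, d] aspect representation, mask: [B, T] with 1 at attended positions.
            scores = self.score(H, aspect.unsqueeze(1).expand_as(H)).squeeze(-1)
            scores = scores.masked_fill(mask == 0, float("-inf"))
            alpha = F.softmax(scores, dim=-1)
            return torch.bmm(alpha.unsqueeze(1), H).squeeze(1)   # [B, d] attended context

        def forward(self, H, aspect, global_mask, local_mask):
            r_global = self.attend(H, aspect, global_mask)       # attention over the whole sentence
            r_local = self.attend(H, aspect, local_mask)         # attention over words near the aspect
            g = torch.sigmoid(self.gate(torch.cat([r_global, r_local], dim=-1)))
            r = g * r_global + (1 - g) * r_local                 # gate weighs the two views element-wise
            return self.classifier(r)

In this sketch, global_mask covers every token of the review while local_mask covers only a window of tokens around the aspect term; the sigmoid gate then decides, element-wise, how much of each attended representation is passed on to the classifier.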

Keywords: attention, sentiment analysis, educational big data, aspect-level


Publication history

Received: 30 August 2020
Accepted: 30 September 2020
Published: 16 November 2020
Issue date: December 2020

Copyright

© The authors 2020

Acknowledgements

This work was partially supported by the National Natural Science Foundation of China (No. 61976247) and the Southwest Jiaotong University Education Reform Project (No. 20201010).

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
