
Recent Advances in Computer Science and Communications

ISSN (Print): 2666-2558
ISSN (Online): 2666-2566

General Research Article

Emotion Recognition in Reddit Comments Using Recurrent Neural Networks

Author(s): Mahdi Rezapour*

Volume 17, Issue 4, 2024

Published on: 15 December, 2023

Article ID: e151223224560

Pages: 9

DOI: 10.2174/0126662558273325231201051141

Abstract

Background: Reddit comments are a valuable source of natural language data in which emotion plays a key role in human communication. However, emotion recognition is a difficult task that requires understanding the context and sentiment of a text. In this paper, we compare the effectiveness of four Recurrent Neural Network (RNN) models for classifying the emotions of Reddit comments.

Methods: We use a small dataset of 4,922 comments labeled with four emotions: approval, disapproval, love, and annoyance. We use pre-trained GloVe (glove.840B.300d) embeddings as the input representation for all models. The models we compare are SimpleRNN, Long Short-Term Memory (LSTM), bidirectional LSTM, and Gated Recurrent Unit (GRU). We experiment with different text preprocessing steps, such as removing stopwords (while keeping negation words in the text) and applying stemming, and we examine the effect of setting the embedding layer as trainable, as sketched below.
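
A minimal Keras/TensorFlow sketch of this pipeline follows. The vocabulary cap, the GRU unit count, and the `preprocess`/`load_glove` helpers are illustrative assumptions rather than details reported in the paper; only the GloVe file, the four emotion classes, and the negation-preserving stopword removal come from the description above.

```python
# Sketch of the described pipeline: negation-preserving stopword removal,
# stemming, GloVe-initialized embeddings, and a GRU classifier.
# VOCAB_SIZE and the 128 GRU units are assumptions, not reported values.
import numpy as np
import tensorflow as tf
from nltk.corpus import stopwords      # requires nltk.download("stopwords")
from nltk.stem import PorterStemmer

EMBED_DIM = 300     # glove.840B.300d vectors are 300-dimensional
VOCAB_SIZE = 20000  # assumed vocabulary cap
NUM_CLASSES = 4     # approval, disapproval, love, annoyance

# Drop stopwords but keep negation words, which can flip sentiment.
NEGATIONS = {"no", "not", "nor", "never"}
STOPWORDS = set(stopwords.words("english")) - NEGATIONS
stemmer = PorterStemmer()

def preprocess(text: str) -> str:
    tokens = [t for t in text.lower().split() if t not in STOPWORDS]
    return " ".join(stemmer.stem(t) for t in tokens)

def load_glove(path: str, word_index: dict) -> np.ndarray:
    """Build an embedding matrix from glove.840B.300d.txt."""
    matrix = np.zeros((VOCAB_SIZE, EMBED_DIM), dtype="float32")
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            # Some 840B tokens contain spaces, so read the vector from the end.
            word = " ".join(parts[:-EMBED_DIM])
            idx = word_index.get(word)
            if idx is not None and idx < VOCAB_SIZE:
                matrix[idx] = np.asarray(parts[-EMBED_DIM:], dtype="float32")
    return matrix

def build_gru(embedding_matrix: np.ndarray, trainable: bool = False):
    # trainable=True fine-tunes the GloVe vectors at extra training cost.
    return tf.keras.Sequential([
        tf.keras.layers.Embedding(
            VOCAB_SIZE, EMBED_DIM,
            embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
            trainable=trainable),
        tf.keras.layers.GRU(128),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
```

Compiling such a model with an Adam optimizer and sparse categorical cross-entropy, then fitting it on padded token sequences, reproduces the general shape of the experiment; the paper's exact hyperparameters are not given in the abstract.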

Results: We find that GRU outperforms all other models, achieving an accuracy of 74%. Bidirectional LSTM and LSTM are close behind, while SimpleRNN performs the worst. We observe that the low accuracy is likely due to the presence of sarcasm, irony, and complexity in the texts. We also notice that setting the embedding layer as trainable improves the performance of LSTM but significantly increases the computational cost and training time. We analyze examples of texts misclassified by GRU and identify the challenges and limitations of the dataset and the models.
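
The comparison itself reduces to swapping a single recurrent layer, and the trainable-embedding trade-off to a single flag. A hedged sketch of that setup is below; the layer sizes are assumptions, and the GloVe initialization from the previous sketch is omitted here for brevity.

```python
# The four compared architectures, differing only in their recurrent layer.
# UNITS and VOCAB_SIZE are illustrative assumptions.
import tensorflow as tf

VOCAB_SIZE, EMBED_DIM, UNITS, NUM_CLASSES = 20000, 300, 128, 4

def make_model(recurrent_layer, trainable_embeddings=False):
    # Per the reported results, trainable embeddings helped LSTM but
    # significantly increased computational cost and training time.
    return tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM,
                                  trainable=trainable_embeddings),
        recurrent_layer,
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

variants = {
    "SimpleRNN": tf.keras.layers.SimpleRNN(UNITS),
    "LSTM": tf.keras.layers.LSTM(UNITS),
    "BiLSTM": tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(UNITS)),
    "GRU": tf.keras.layers.GRU(UNITS),
}
models = {name: make_model(layer) for name, layer in variants.items()}
```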

Conclusion: In our study, GRU was found to be the best of the four RNN models we compared for emotion classification of Reddit comments. We also discuss future directions for improving emotion recognition on Reddit comments, and we provide an extensive discussion of the applications and methods behind each technique in the context of the paper.
