Complaint Classification using Word2Vec Model

  • Abstract

    An attempt has been made to develop a versatile, universal grievance segregator that classifies orally registered complaints into one of a set of predefined categories. The oral complaints are first converted to text, and each word is then represented as a vector using word2vec. Each grievance is condensed into a single vector by a Gated Recurrent Unit (GRU), which implements the hidden state of a Recurrent Neural Network (RNN) model. The popular Multi-Layer Perceptron (MLP) is used as the classifier to identify the categories.
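The pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes word2vec embeddings are already available (here random stand-ins), uses a hand-rolled GRU cell whose final hidden state serves as the grievance vector, and applies an untrained single-hidden-layer MLP for the category decision. All dimensions and parameter names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUEncoder:
    """Encodes a sequence of word vectors into one grievance vector
    (the final hidden state of a GRU)."""
    def __init__(self, d_in, d_h):
        s = 0.1  # small random initialization (illustrative)
        self.Wz, self.Uz = s * rng.standard_normal((d_h, d_in)), s * rng.standard_normal((d_h, d_h))
        self.Wr, self.Ur = s * rng.standard_normal((d_h, d_in)), s * rng.standard_normal((d_h, d_h))
        self.Wh, self.Uh = s * rng.standard_normal((d_h, d_in)), s * rng.standard_normal((d_h, d_h))
        self.d_h = d_h

    def encode(self, word_vectors):
        h = np.zeros(self.d_h)
        for x in word_vectors:
            z = sigmoid(self.Wz @ x + self.Uz @ h)              # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h)              # reset gate
            h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate state
            h = (1 - z) * h + z * h_tilde                       # interpolate old/new state
        return h

class MLPClassifier:
    """Single-hidden-layer MLP over the grievance vector (untrained sketch)."""
    def __init__(self, d_in, d_hidden, n_classes):
        s = 0.1
        self.W1 = s * rng.standard_normal((d_hidden, d_in))
        self.W2 = s * rng.standard_normal((n_classes, d_hidden))

    def predict(self, v):
        hidden = np.tanh(self.W1 @ v)
        logits = self.W2 @ hidden
        return int(np.argmax(logits))  # index of the predicted category

# Example: a grievance of 5 words, each a 50-dim word2vec-style embedding.
words = [rng.standard_normal(50) for _ in range(5)]
encoder = GRUEncoder(d_in=50, d_h=32)
grievance_vec = encoder.encode(words)               # one vector per grievance
clf = MLPClassifier(d_in=32, d_hidden=16, n_classes=4)
category = clf.predict(grievance_vec)
```

In a full system the GRU and MLP weights would be trained jointly on labelled grievances, and the word vectors would come from a word2vec model fitted on the complaint corpus.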


  • Keywords

    Gated Recurrent Unit; Recurrent Neural Network; Text Classification; Word2Vec

  • References

      [1] R. Collobert and J. Weston, Fast semantic extraction using a novel neural network architecture. Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics, pp. 560-567, 2007.

      [2] F. Sebastiani, Machine learning in automated text categorization. ACM Computing Surveys (CSUR), vol. 34, pp. 1-47, 2002.

      [3] Yoshua Bengio, Réjean Ducharme, Pascal Vincent, and Christian Jauvin, A Neural Probabilistic Language Model. Journal of Machine Learning Research, vol. 3, pp. 1137-1155, 2003.

      [4] D. Bhandari and P. S. Ghosh, Parametric representation of paragraphs and their classification. In Proc. 2nd Int. Conf. on International Conference on Advanced Computing, Networking and Informatics (ICACNI-2014), Springer, 179-186, Kolkata, 2014.

      [5] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams, Learning representations by back-propagating errors. Nature, vol. 323, no. 6088, pp. 533-536, 1986.

      [6] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean, Efficient estimation of word representations in vector space. CoRR, abs/1301.3781, 2013.

      [7] Tomas Mikolov, Ilya Sutskever, Kai Chen, Gregory S. Corrado, and Jeffrey Dean, Distributed representations of words and phrases and their compositionality. In Advances in Neural Information Processing Systems 26: 27th Annual Conference on Neural Information Processing Systems 2013, December 5-8, 2013, Lake Tahoe, Nevada, United States, pp. 3111-3119, 2013.

      [8] Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig, Linguistic Regularities in Continuous Space Word Representations. Proceedings of NAACL-HLT 2013, pp. 746-751, Atlanta, Georgia, 9-14 June 2013.

      [9] J. Chung, C. Gulcehre, K. Cho, Y. Bengio, Empirical evaluation of gated recurrent neural networks on sequence modeling. CoRR, abs/1412.3555, 2014.

      [10] D. Bahdanau, K. Cho, and Y. Bengio, Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.

      [11] Diederik P. Kingma and Jimmy Ba, Adam: A Method for Stochastic Optimization. arXiv:1412.6980 [cs.LG], December 2014.

      [12] Mike Schuster and Kuldip K. Paliwal, Bidirectional Recurrent Neural Networks. IEEE Transactions on Signal Processing, vol. 45, no. 11, November 1997.




Article ID: 20192
DOI: 10.14419/ijet.v7i4.5.20192

Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.