Semantic based neural model approach for text simplification

  • Authors

    • Hemanth Somasekar, RNS Institute of Technology
    • Dr. Kavya Naveen, RNS Institute of Technology
    2018-07-08
    https://doi.org/10.14419/ijet.v7i3.13291
  • Machine translation systems are affected by various difficulties, such as long-distance dependencies and long sentences with complex syntax. Text Summarization (TeSu) and Text Simplification (TeSi) are important ways of simplifying text for users with poor reading ability, including non-native speakers, functionally illiterate readers, and children. TeSu produces a brief summary of the main ideas of a text, while TeSi aims to reduce the linguistic complexity of the text while retaining its original meaning. Sequence-to-sequence models built on TeSu and TeSi approaches have recently achieved considerable success in many text generation tasks. However, the generated texts often have low Semantic Relevance (SR) to their sources, even though a Simplified Text (SiT) generated from a Source Text (SoT) should remain closely similar to it. The goal of this paper is to improve the SR between the original and the modified texts for both TeSu and TeSi. The proposed SR-based Neural model (SRN) encourages high semantic similarity between texts and their summaries: the encoder produces a representation of the SoT, the decoder produces a representation of the summary, and during training the model maximizes the Similarity Score (SS) between the two representations. Experiments were conducted with this approach on two benchmark datasets. The results show that the SRN approach outperforms the existing methods in terms of readability metrics, human sentence-level evaluation, and Post-Editing (PE) evaluation.
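
    As a concrete reading of the abstract, the sketch below shows a sequence-to-sequence model whose training objective rewards a high Similarity Score between the encoder's source representation and the decoder's summary representation. It is only an illustrative assumption, not the paper's published implementation: the GRU layers, the cosine-similarity relevance term, the weight lam, and the names SRNSeq2Seq and srn_loss are all hypothetical.

```python
# Minimal sketch of a semantic-relevance (SR) regularized seq2seq model.
# All names and hyper-parameters are illustrative assumptions; the paper
# does not publish code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SRNSeq2Seq(nn.Module):
    """Encoder-decoder whose final hidden states serve as text representations."""

    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the source text; the final hidden state is its representation.
        enc_outs, src_repr = self.encoder(self.embed(src_ids))       # (1, B, H)
        # Decode the simplified/summary text conditioned on the source.
        dec_outs, tgt_repr = self.decoder(self.embed(tgt_ids), src_repr)
        logits = self.out(dec_outs)                                   # (B, T, V)
        return logits, src_repr.squeeze(0), tgt_repr.squeeze(0)


def srn_loss(logits, tgt_ids, src_repr, tgt_repr, lam=0.5, pad_id=0):
    """Cross-entropy plus a term that rewards a high cosine Similarity Score
    between the source and summary representations."""
    ce = F.cross_entropy(logits[:, :-1].reshape(-1, logits.size(-1)),
                         tgt_ids[:, 1:].reshape(-1), ignore_index=pad_id)
    ss = F.cosine_similarity(src_repr, tgt_repr, dim=-1).mean()
    return ce - lam * ss  # maximizing the Similarity Score lowers the loss
```

    In this sketch, training with srn_loss pulls the decoder's summary representation toward the encoder's source representation, which mirrors the mechanism the abstract describes for improving the semantic relevance between source and simplified texts.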



  • How to Cite

    Somasekar, H., & Kavya Naveen, D. (2018). Semantic based neural model approach for text simplification. International Journal of Engineering & Technology, 7(3), 1366-1371. https://doi.org/10.14419/ijet.v7i3.13291