Analysis of Dynamic Topic Modeling for Textual Data Using A Novel Approach of Hybrid Deep Learning with NMF and NTD

  • Authors

    • C. B. Pavithra, Research Scholar, Department of Information Technology, Dr. N. G. P. Arts & Science College, Coimbatore, Tamil Nadu, India
    • Dr. J. Savitha, Professor, Department of Information Technology, Dr. N. G. P. Arts & Science College, Coimbatore, Tamil Nadu, India
    https://doi.org/10.14419/5t43s482

    Received date: July 14, 2025

    Accepted date: August 28, 2025

    Published date: September 12, 2025

  • Keywords

    Dynamic Topic Modeling; Real-Time Textual Data Analysis; Non-Negative Matrix Factorization; Supervised NMF; Non-Negative Tucker Decomposition; Hybrid Deep Learning; Advanced Topic Modeling for Research Articles 2.0; Evaluation Metrics

  • Abstract

    In this research, we conduct an in-depth analysis of dynamic topic modeling techniques for real-time and evolving textual data, covering Non-Negative Matrix Factorization (NMF), Supervised NMF (SNMF), Non-Negative Tucker Decomposition (NTD), and hybrid models: the Hybrid HDP-CT-DTM, the Hybrid DTM-RNN, and the proposed hybrid of Convolutional Neural Networks (CNNs) with NMF and NTD. Using the "Advanced Topic Modeling for Research Articles 2.0" dataset, which comprises 14,000 documents, this paper evaluates the effectiveness of these methods in terms of perplexity, coherence, precision, recall, F-score, and accuracy. Our findings indicate that the proposed hybrid CNN with NTD model outperforms the other techniques across all evaluation metrics, demonstrating a superior ability to capture complex topic structures while maintaining high accuracy. This performance is attributed to the rich feature extraction capabilities of CNNs and the higher-order interaction modeling provided by NTD. The work highlights the potential of advanced hybrid models for improving the quality and interpretability of topic models on dynamic and large-scale textual datasets.
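    For readers who want a concrete starting point, the sketch below illustrates two of the building blocks named in the abstract: NMF-based topic extraction scored with C_v coherence, and a non-negative Tucker decomposition of a small third-order tensor. It is a minimal illustration using scikit-learn, gensim, and tensorly; the toy documents, topic count, and tensor shape are assumptions made for demonstration and do not reproduce the paper's hybrid CNN with NTD model, its dataset, or its hyperparameters.

    ```python
    # Minimal sketch of two building blocks from the abstract (not the paper's
    # hybrid CNN + NTD model): NMF topic extraction with coherence scoring, and
    # non-negative Tucker decomposition (NTD) of a toy third-order tensor.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import non_negative_tucker
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import NMF
    from gensim.corpora import Dictionary
    from gensim.models.coherencemodel import CoherenceModel

    # Toy corpus; the paper uses the 14,000-document "Advanced Topic Modeling
    # for Research Articles 2.0" dataset instead.
    docs = [
        "neural networks learn distributed representations of scientific text",
        "matrix factorization uncovers latent topics in document collections",
        "tensor decomposition captures higher order interactions over time",
        "convolutional networks extract local phrase features from sentences",
    ]

    # NMF baseline: factorize the TF-IDF matrix X ~= W @ H with non-negative factors.
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(docs)                   # documents x terms
    vocab = vectorizer.get_feature_names_out()

    nmf = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
    W = nmf.fit_transform(X)                             # document-topic weights
    H = nmf.components_                                  # topic-term weights

    # Top words per topic, then C_v coherence (one of the metrics reported
    # alongside perplexity, precision, recall, F-score, and accuracy).
    top_words = [[str(vocab[i]) for i in row.argsort()[::-1][:5]] for row in H]
    tokens = [d.split() for d in docs]
    coherence = CoherenceModel(topics=top_words, texts=tokens,
                               dictionary=Dictionary(tokens),
                               coherence="c_v").get_coherence()
    print("topics:", top_words)
    print("c_v coherence:", round(coherence, 3))

    # NTD: decompose a documents x terms x time tensor into a non-negative core
    # and factor matrices, capturing the higher-order interactions the abstract
    # credits for the hybrid model's performance.
    tensor = tl.tensor(np.random.default_rng(0).random((6, 20, 3)))
    core, factors = non_negative_tucker(tensor, rank=[2, 4, 2], n_iter_max=100)
    print("core:", core.shape, "factor shapes:", [f.shape for f in factors])
    ```

    In the proposed hybrid, CNN-extracted document features would feed such factorizations in place of raw TF-IDF inputs; that coupling is specific to the paper and is not sketched here.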

  • How to Cite

    Pavithra, C. B., & Savitha, J. (2025). Analysis of Dynamic Topic Modeling for Textual Data Using A Novel Approach of Hybrid Deep Learning with NMF and NTD. International Journal of Basic and Applied Sciences, 14(5), 361-378. https://doi.org/10.14419/5t43s482