Gradient Boosting Decision Tree Classification-Based Facial Emotion Detection Using Machine Learning

  • Authors

    • Ms. K. Rajeswari, Assistant Professor, Department of Computer Science Engineering, Mahendra Engineering College, Namakkal, Tamil Nadu, India
    • S. Tharaneedharan, PG Student, Department of Computer Science Engineering, Mahendra Engineering College, Namakkal, Tamil Nadu, India
    https://doi.org/10.14419/nx916614

    Received date: July 15, 2025

    Accepted date: July 24, 2025

    Published date: November 1, 2025

  • Keywords: Contrast Limited Adaptive Histogram Equalization; Elephant Herding Optimization; Gradient Boosting Decision Tree; Matthews Correlation Coefficient.
  • Abstract

    Facial Emotion Detection is an automated computer vision task that determines the emotion of a human face from facial expressions in still images or video. It works by extracting facial features and labeling them as happy, sad, angry, or surprised, and it finds wide application in human-computer interaction, mental health assessment, and surveillance systems. Achieving high detection accuracy under varying lighting conditions and occlusions remains a challenge that current systems fail to meet; they also generalize poorly across different facial structures, age groups, and cultural expressions. To address these problems, Contrast Limited Adaptive Histogram Equalization (CLAHE) is used to pre-process facial images or video frames: it improves local contrast and reveals subtle emotional cues by boosting brightness in small regions while restricting noise amplification. Elephant Herding Optimization (EHO) is then used to optimize the learning process and to select the most essential facial features for expression recognition. The Gradient Boosting Decision Tree (GBDT), which captures non-linear relations effectively and improves prediction accuracy, carries out the classification task. The model is evaluated with standard performance indicators, achieving accuracy of 89.3%, precision of 87.2%, recall of 86.5%, specificity of 88.1%, AUC-ROC of 0.91, and Matthews Correlation Coefficient (MCC) of 0.84. Experimental findings indicate that the proposed approach attains the highest support rates for the dominant emotion categories and regularly exceeds classical schemes in the most severe and complicated facial recognition contexts, indicating high reliability and resilience across challenging tasks.
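
    The pipeline described above can be sketched end-to-end with standard libraries. The fragment below is a minimal illustration under stated assumptions, not the authors' implementation: it applies CLAHE via OpenCV, substitutes a simple variance-based filter for the EHO feature-selection step (EHO is a metaheuristic not available in standard packages), trains scikit-learn's GradientBoostingClassifier as the GBDT, and reports accuracy, MCC, and AUC-ROC. The synthetic 48x48 face crops and four-class labels are placeholders for a real facial-expression dataset.

    import cv2
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.feature_selection import VarianceThreshold
    from sklearn.metrics import accuracy_score, matthews_corrcoef, roc_auc_score
    from sklearn.model_selection import train_test_split

    def preprocess_clahe(gray_face):
        # Boost local contrast in small tiles while clipping the histogram to
        # limit noise amplification, as CLAHE does in the described pipeline.
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        return clahe.apply(gray_face)

    # Synthetic stand-in data: 48x48 grayscale "face" crops with four emotion
    # labels; replace with a real facial-expression dataset in practice.
    rng = np.random.default_rng(0)
    faces = rng.integers(0, 256, size=(400, 48, 48), dtype=np.uint8)
    labels = rng.integers(0, 4, size=400)

    X = np.stack([preprocess_clahe(f).ravel() for f in faces]).astype(np.float32)
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.2, stratify=labels, random_state=42)

    # Stand-in for the EHO-driven feature selection: keep only pixels whose
    # variance exceeds a small threshold.
    selector = VarianceThreshold(threshold=1e-3).fit(X_train)
    X_train, X_test = selector.transform(X_train), selector.transform(X_test)

    # Gradient Boosting Decision Tree classifier over the selected features.
    gbdt = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1,
                                      max_depth=3, random_state=42)
    gbdt.fit(X_train, y_train)

    y_pred = gbdt.predict(X_test)
    print("accuracy:", accuracy_score(y_test, y_pred))
    print("MCC:     ", matthews_corrcoef(y_test, y_pred))
    print("AUC-ROC: ", roc_auc_score(y_test, gbdt.predict_proba(X_test),
                                     multi_class="ovr"))

    In practice, faster GBDT implementations such as LightGBM or XGBoost can be swapped in for GradientBoostingClassifier without changing the rest of the sketch.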

  • How to Cite

    Rajeswari, M. K., & Tharaneedharan, S. (2025). Gradient Boosting Decision Tree Classification-Based Facial Emotion Detection Using Machine Learning. International Journal of Basic and Applied Sciences, 14(SI-1), 606-614. https://doi.org/10.14419/nx916614