The Preprocessing for Predicting of Physical Activity Recognition

  • Authors

    • Sakchai Muangsrinoon
    • Poonpong Boonbrahm
    2018-11-27
    https://doi.org/10.14419/ijet.v7i4.19.22089
  • Keywords: Accelerometer, Android Wear, Preprocessing, Multiclass classification, Physical activity recognition
  • This experiment examined preprocessing for a physical activity recognition model, assessing the relationship between the aggregation time window of a single tri-axial accelerometer and recognition of four activities (sitting, standing, walking, and running). The experiment involved sixteen students (62.5% male and 37.5% female, aged eighteen to twenty-three) from the School of Informatics at Walailak University. The authors split the dataset 80% for training and testing and 20% for validation, and used repeated k-fold cross-validation (10 folds, 3 repeats) as the resampling method to evaluate the baseline models. Measuring model performance gave the following results. First, on the raw dataset of 123,156 samples, the best-performing models were KNN (k-Nearest Neighbors) and RF (Random Forest), each at 100% accuracy. Second, on the dataset aggregated over 1-second windows (1,240 samples), the best model was RF at 100%. Third, with 5-second windows (251 samples), RF reached 99.5%. Fourth, with 10-second windows (128 samples), KNN reached 96.82%. Fifth, with 15-second windows (86 samples), KNN reached 96.21%. Sixth, with 20-second windows (66 samples), LDA (Linear Discriminant Analysis) reached 98%. Seventh, with 25-second windows (54 samples), KNN reached 96.33%. Eighth and finally, with 30-second windows (46 samples), KNN reached 93.61%. In future work, the authors plan to improve model accuracy by adding features from another sensor, heart rate. Mining data collected from sensors yields valuable results in physical activity recognition, and performance improvements are especially needed in healthcare. As wearable devices become more widely used, opportunities in this data mining research area will continue to broaden.
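The aggregation step the abstract describes (collapsing raw accelerometer samples into fixed time windows, which shrinks the 123,156-sample raw dataset to between 1,240 and 46 windowed samples) can be sketched as below. This is a hypothetical illustration, not the authors' actual code: the per-axis mean features and majority-vote window labels are assumptions, since the abstract does not state which summary statistics were used.

```python
import numpy as np

def aggregate_windows(t, x, y, z, labels, window_s=1.0):
    """Aggregate raw tri-axial accelerometer samples into fixed time windows.

    t: sample timestamps in seconds; x, y, z: per-axis readings;
    labels: activity label per raw sample; window_s: window length in seconds.
    Returns per-window [mean_x, mean_y, mean_z] features and a label per window.
    """
    t = np.asarray(t, dtype=float)
    bins = np.floor(t / window_s).astype(int)  # window index for each raw sample
    features, window_labels = [], []
    for b in np.unique(bins):
        m = bins == b
        features.append([np.mean(np.asarray(a)[m]) for a in (x, y, z)])
        # label the window by the majority activity among its raw samples
        vals, counts = np.unique(np.asarray(labels)[m], return_counts=True)
        window_labels.append(vals[np.argmax(counts)])
    return np.array(features), np.array(window_labels)
```

Longer windows produce fewer training samples, which is consistent with the sample counts reported above (1,240 windows at 1 second down to 46 at 30 seconds).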
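The evaluation protocol (an 80/20 split plus repeated 10-fold cross-validation with 3 repeats over KNN, RF, and LDA baselines) can be sketched as follows. The use of scikit-learn, the synthetic data, and the default classifier settings are all illustrative assumptions, not the authors' setup; the original "number=10, repeats=3" notation suggests R's caret package.

```python
import numpy as np
from sklearn.model_selection import train_test_split, RepeatedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))     # stand-in for per-window accelerometer features
y = rng.integers(0, 4, size=400)  # four activities: sit, stand, walk, run

# 80% for training and testing, 20% held out for validation
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# repeated k-fold cross-validation: 10 folds, 3 repeats = 30 scores per model
cv = RepeatedKFold(n_splits=10, n_repeats=3, random_state=0)
models = {
    "KNN": KNeighborsClassifier(),
    "RF": RandomForestClassifier(random_state=0),
    "LDA": LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    scores = cross_val_score(model, X_train, y_train, cv=cv)
    print(f"{name}: mean accuracy {scores.mean():.3f} over {len(scores)} folds")
```

On the synthetic random data above the accuracies are near chance; on real windowed accelerometer features the abstract reports the 93-100% range quoted earlier.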


  • How to Cite

    Muangsrinoon, S., & Boonbrahm, P. (2018). The Preprocessing for Predicting of Physical Activity Recognition. International Journal of Engineering & Technology, 7(4.19), 349-354. https://doi.org/10.14419/ijet.v7i4.19.22089