Full Hand Pose Recognition in Performing Daily Activities for Tele-Rehabilitation based on Decision Tree Algorithm

Authors

  • Nurul Shafiqah Haja Salim Department of Mechatronics Engineering, International Islamic University Malaysia, Jalan Gombak, 53100 Kuala Lumpur, Malaysia
  • Norsinnira Zainul Azlan Department of Mechatronics Engineering, International Islamic University Malaysia, Jalan Gombak, 53100 Kuala Lumpur, Malaysia
  • Hafizu Ibrahim Hassan Department of Mechatronics Engineering, Ahmadu Bello University Zaria, PMB 8987, Nigeria
  • Anis Nurashikin Nordin Department of Electrical and Computer Engineering, International Islamic University Malaysia, Jalan Gombak, 53100 Kuala Lumpur, Malaysia
  • Sajjad Hosen Department of Electrical and Computer Engineering, International Islamic University Malaysia, Jalan Gombak, 53100 Kuala Lumpur, Malaysia

DOI:

https://doi.org/10.15282/mekatronika.v6i1.10187

Keywords:

Hand Pose Recognition, Decision Tree, Machine Learning, Data Collection, Tele-Rehabilitation

Abstract

The older population has the highest risk of stroke, which leads to high healthcare costs and a heavy economic burden on the nation. Tele-rehabilitation helps enhance the lives of stroke survivors by allowing them to conduct therapy from home, which benefits patients with low mobility and those living far from medical centers. This work focuses on the development of full hand pose recognition in performing daily activities for tele-rehabilitation treatment using the Decision Tree machine learning algorithm. A force sensor, flexible sensors, and an MPU6050 Micro-Electro-Mechanical System (MEMS) sensor are used for data collection. The sensors' resistance and acceleration readings are the inputs to the machine learning algorithm, and the type of hand pose is the output. Three hand gestures are chosen in this study: grasping a glass, turning a pipe, and switching on a plug. A procedure for data collection has been devised. The Decision Tree has been trained and tested using the Python programming language on the Jupyter Notebook web-based interactive computing platform. At this stage of the study, tests are conducted with healthy subjects to validate the effectiveness of the proposed recognition system. An accuracy of 94% has been achieved. The sensor readings show a different curve pattern for each activity. This project will assist medical staff in delivering better treatment to patients and will lead to a faster recovery process.
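The pipeline described above can be sketched in a few lines of Python with scikit-learn. The feature layout (five flex-sensor resistances, one force reading, three acceleration axes) and the synthetic, separable clusters below are illustrative assumptions, not the authors' actual dataset or hyperparameters:

```python
# Hedged sketch of the paper's pipeline: a Decision Tree classifying hand
# poses from flex-sensor resistances, a force reading, and MPU6050
# accelerations. Feature layout and data are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
POSES = ["grasp_glass", "turn_pipe", "switch_plug"]

def synth_samples(pose_id, n=100):
    # Each sample: 5 flex resistances + 1 force reading + 3 accelerations.
    # Cluster centres per pose are arbitrary, chosen only to be separable.
    centre = np.array([pose_id * 2.0] * 5 + [pose_id * 1.5] + [pose_id * 0.5] * 3)
    return centre + rng.normal(scale=0.3, size=(n, 9))

X = np.vstack([synth_samples(i) for i in range(len(POSES))])
y = np.repeat(np.arange(len(POSES)), 100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

clf = DecisionTreeClassifier(max_depth=5, random_state=42)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

On real sensor streams the features would come from windowed readings rather than single samples, and the tree depth would be tuned against held-out subject data.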

References

S.Y.A. Mounis, N.Z. Azlan and F. Sado, “Assist-as-needed control strategy for upper-limb rehabilitation based on subject’s functional ability,” Measurement and Control, vol. 52, issue 9-10, pp. 1354-1361, 2019.

S.Y.A. Mounis, N.Z. Azlan and F. Sado, “Progress based assist-as-needed control strategy for upper-limb rehabilitation”, Proc. of the IEEE Conference on Systems, Process and Control (ICSPC), pp. 65-70, 2017.

J. K. Deters and Y. Rybarczyk, “Hidden Markov Model Approach for the Assessment of Tele-Rehabilitation Exercises,” International Journal of Artificial Intelligence, vol. 16, no. 1, pp. 1-19, 2018.

G. Rogez, J. S. Supancic and D. Ramanan, “Egocentric Pose Recognition in Four Lines of Code,” https://arxiv.org/abs/1412.0060, 2014.

A. Rashid and O. Hasan, “Wearable technologies for hand joints monitoring for rehabilitation: A survey,” Microelectronics Journal, vol. 88, pp. 173–183, 2019, doi: 10.1016/j.mejo.2018.01.014.

Z. Tian, J. Wang, X. Yang and M. Zhou, “WiCatch: A WiFi-based hand gesture recognition system,” IEEE Access, vol. 6, pp. 16911–16923, 2018, doi: 10.1109/ACCESS.2018.2814575.

R. Rastgoo, K. Kiani and S. Escalera, “Multi-modal deep hand sign language recognition in still images using Restricted Boltzmann Machine,” Entropy, vol. 20, no. 11, 2018, doi: 10.3390/e20110809.

O. Glauser, S. Wu, D. Panozzo, O. Hilliges and O. Sorkine-Hornung, “Interactive hand pose estimation using a stretch-sensing soft glove,” ACM Transactions on Graphics, vol. 38, no. 4, 2019, doi: 10.1145/3306346.3322957.

C. M. M. Refat and N. Z. Azlan, “Stretch sensor-based facial expression recognition and classification using machine learning,” International Journal of Computational Intelligence and Applications, vol. 20, no. 2, article number 2150010, 2021.

C. M. M. Refat, N. Z. Azlan, “Deep learning methods for facial expression recognition,” Proc. of the 7th International Conference on Mechatronics Engineering (ICOM), pp. 1-6, 2019.

R. M. Stephenson, R. Chai and D. Eager, “Isometric finger pose recognition with sparse channel spatio-temporal EMG imaging,” Proc. of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 5232–5235, 2018, doi: 10.1109/EMBC.2018.8513445.

D. S. Tran, N. H. Ho, H. J. Yang, E. T. Baek, S. H. Kim and G. Lee, “Real-time hand gesture spotting and recognition using RGB-D camera and 3D convolutional neural network,” Applied Sciences, vol. 10, no. 2, article number 722, 2020, doi: 10.3390/app10020722.

A. R. Asif, A. Waris, S. O. Gilani, M. Jamil, H. Ashraf, M. Shafique and I. K. Niazi, “Performance evaluation of Convolutional Neural Network for hand gesture recognition using EMG,” Sensors, vol. 20, article number 1642, 2020, doi: 10.3390/s20061642.

L. I. Khalaf, S. A. Aswad, S. R. Ahmed, B. Makki and M. R. Ahmed, “Survey on recognition of hand gestures using Data Mining algorithms,” Proc. of the 4th International Congress on Human-Computer Interaction, Optimization and Robotic Applications, pp. 1–4, 2022, doi: 10.1109/HORA55278.2022.9800090.

A. Abdulhussein and F. Raheem, “Hand gesture recognition of static letters American Sign Language (ASL) using Deep Learning,” Engineering and Technology Journal, vol. 38, no. 6, pp. 926–937, 2020, doi: 10.30684/etj.v38i6a.533.

Y. Li and P. Zhang, “Static hand gesture recognition based on hierarchical decision and classification of finger features,” Science Progress, vol. 105, no. 1, 2022, doi: 10.1177/00368504221086362.

S. Bakheet and A. Al-Hamadi, “Robust hand gesture recognition using multiple shape-oriented visual cues,” EURASIP Journal on Image and Video Processing, vol. 2021, no. 1, article number 26, 2021, doi: 10.1186/s13640-021-00567-1.

D. Naglot and M. Kulkarni, “Real time sign language recognition using the leap motion controller,” Proc. of the International Conference on Inventive Computation Technologies, pp. 1–5, 2016, doi: 10.1109/INVENTIVE.2016.7830097.

P. N. Tan, M. Steinbach and V. Kumar, “Introduction to Data Mining,” Pearson Education India, 2016.

V. R. Konasani and S. Kadre, “Machine Learning and Deep Learning Using Python and TensorFlow,” 1st Edition, McGraw Hill, 2021.

Published

2024-05-05

How to Cite

[1]
N. S. Haja Salim, N. Zainul Azlan, H. I. Hassan, A. N. Nordin, and S. Hosen, “Full Hand Pose Recognition in Performing Daily Activities for Tele-Rehabilitation based on Decision Tree Algorithm”, Mekatronika: J. Intell. Manuf. Mechatron., vol. 6, no. 1, pp. 81–91, May 2024.

Issue

Section

Original Article