Integrated Hand and Eye Communication Device for Intensive Care Unit (ICU) Patients


  • Nur Adlina Mohd Zamri Department of Mechatronics Engineering, Kulliyyah of Engineering, International Islamic University Malaysia, Jalan Gombak, 53100 Kuala Lumpur, Malaysia
  • Norsinnira Zainul Azlan Department of Mechatronics Engineering, Kulliyyah of Engineering, International Islamic University Malaysia, Jalan Gombak, 53100 Kuala Lumpur, Malaysia
  • Mohd Basri Mat Nor Department of Anaesthesiology and Intensive Care, Kulliyyah of Medicine, International Islamic University Malaysia, Jalan Sultan Ahmad Shah, Bandar Indera Mahkota, 25200 Kuantan, Pahang, Malaysia



Intensive Care Unit (ICU) Patients, Hand Communication Device, Eye Communication Device, Hand Gesture Detection, Eye Movement Tracking


Many Intensive Care Unit (ICU) patients are unable to speak or move their bodies because they are intubated or have weakened muscles. This makes communication harder, as they are left voiceless in expressing their thoughts and needs. This study focuses on the development of an integrated hand and eye communication device based on hand gesture recognition and eye movement tracking with an output display. The device offers a dual mode, in which hand gestures and eye movements are detected using flex sensors and reflectance sensors, respectively. An Arduino Uno is used as the microcontroller, and the output window is programmed in the Java programming language. Four messages commonly communicated by ICU patients are installed on the system. The device has been tested at the laboratory stage with healthy subjects. The results validate that the device successfully conveyed the intended messages correctly in all trials, and the resulting sensor measurement curves are consistent across all subjects and messages. The developed device will contribute towards better communication between patients and healthcare providers, leading to more convenient and efficient patient care.
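To illustrate the message-selection logic described above, the following is a minimal Java sketch of how flex-sensor readings forwarded by the Arduino might be mapped to one of four preset messages in the output window. The threshold value, pin-to-finger mapping, and message texts are illustrative assumptions, not the paper's actual implementation.

```java
// Hypothetical sketch: map flex-sensor ADC readings (0-1023, as sent by an
// Arduino Uno over serial) to one of four preset messages.
public class GestureMessageMapper {
    // Placeholder texts; the actual four ICU messages are defined in the paper.
    private static final String[] MESSAGES = {
        "Message 1", "Message 2", "Message 3", "Message 4"
    };
    // Assumed threshold: readings above the ADC midpoint count as "bent".
    private static final int BENT_THRESHOLD = 512;

    /** Returns the index of the message selected by the first bent finger, or -1. */
    public static int classify(int[] flexReadings) {
        for (int i = 0; i < flexReadings.length && i < MESSAGES.length; i++) {
            if (flexReadings[i] > BENT_THRESHOLD) {
                return i; // first finger over the threshold selects the message
            }
        }
        return -1; // no recognised gesture
    }

    public static void main(String[] args) {
        int[] sample = {300, 700, 250, 100}; // second finger bent
        int idx = classify(sample);
        System.out.println(idx >= 0 ? MESSAGES[idx] : "No gesture");
    }
}
```

In practice, a device like this would debounce the readings (requiring the gesture to be held for some time) before committing to a message, so that transient hand movements do not trigger false outputs.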


J. L. Guttormson, K. L. Bremer, and R. M. Jones, “‘Not being able to talk was horrid’: A descriptive, correlational study of communication during mechanical ventilation,” Intensive and Critical Care Nursing, vol. 31, no. 3, pp. 179–186, Jun. 2015, doi: 10.1016/j.iccn.2014.10.007.

R. A. Al-Khulaidi, R. Akmeliawati, N. Z. Azlan, N. H. A. Bakr, and N. M. Fauzi, "Development of robotic hands of SignBot, advanced Malaysian sign-language performing robot," Advances in Robotics Research, vol. 2, no. 3, p. 183, 2018.

N. Mohamed, M. B. Mustafa and N. Jomhari, "A Review of the Hand Gesture Recognition System: Current Progress and Future Directions," in IEEE Access, vol. 9, pp. 157422-157436, 2021, doi: 10.1109/ACCESS.2021.3129650.

M. Al-Hammadi, G. Muhammad, W. Abdul, M. Alsulaiman, M. A. Bencherif and M. A. Mekhtiche, "Hand Gesture Recognition for Sign Language Using 3DCNN," in IEEE Access, vol. 8, pp. 79491-79509, 2020, doi: 10.1109/ACCESS.2020.2990434.

B. G. Lee, V. C. Tran, and T. W. Chong, “Smart hand device gesture recognition with dynamic time-warping method,” in ACM International Conference Proceeding Series, Dec. 2017, pp. 216–219. doi: 10.1145/3175684.3175697.

V. Gajul, P. Sharangdhar, S. Shinde, and S. S. Pawar, “Wireless Assistive Communication System for Speech Impaired Person,” in Lecture Notes in Networks and Systems, vol. 34, Springer, 2018, pp. 95–104. doi: 10.1007/978-981-10-8198-9_10.

S. Fayyaz, R. Bukhsh, M. A. Khan, H. A. Hamza Gondal, and S. Tahir, “Adjustment of Bed for a Patient Through Gesture Recognition: An Image Processing Approach,” Proceedings of the 21st International Multi Topic Conference, INMIC 2018, Dec. 2018, doi: 10.1109/INMIC.2018.8595453.

S. M. A. Hoque, M. S. Haq, and M. Hasanuzzaman, “Computer Vision Based Gesture Recognition for Desktop Object Manipulation,” 2018 International Conference on Innovation in Engineering and Technology, ICIET 2018, Mar. 2019, doi: 10.1109/CIET.2018.8660916.

H. You et al., "EyeCoD: Eye Tracking System Acceleration via FlatCam-Based Algorithm and Hardware Co-Design," in IEEE Micro, vol. 43, no. 4, pp. 88-97, July-Aug. 2023, doi: 10.1109/MM.2023.3274736.

H. Zhang, S. Wu, W. Chen, Z. Gao and Z. Wan, "Self-Calibrating Gaze Estimation with Optical Axes Projection for Head-Mounted Eye Tracking," in IEEE Transactions on Industrial Informatics, doi: 10.1109/TII.2023.3276322.

S. I. Khan and R. B. Pachori, "Automated Eye Movement Classification Based on EMG of EOM Signals Using FBSE-EWT Technique," in IEEE Transactions on Human-Machine Systems, vol. 53, no. 2, pp. 346-356, April 2023, doi: 10.1109/THMS.2023.3238113.

J. K. Kim, J. Park, Y. -K. Moon and S. -J. Kang, "Improving Gaze Tracking in Large Screens With Symmetric Gaze Angle Amplification and Optimization Technique," in IEEE Access, vol. 11, pp. 85799-85811, 2023, doi: 10.1109/ACCESS.2023.3282185.

Z. O. Abu-Faraj, M. J. Mashaalany, H. C. B. Sleiman, J. L. D. Heneine, and W. M. al Katergi, “Design and development of a low-cost eye tracking system for the rehabilitation of the completely locked-in patient,” in Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings, 2006, pp. 4905–4908. doi: 10.1109/IEMBS.2006.260280.

R. Garand, A. Jackson, N. Marquart, L. Sapharas, and D. Vorac, "The Eye Tracking System," Ohio State University, Team B.I.T. Report, 2009.




How to Cite

N. A. Mohd Zamri, N. Zainul Azlan, and M. B. Mat Nor, "Integrated Hand and Eye Communication Device for Intensive Care Unit (ICU) Patients", Mekatronika: J. Intell. Manuf. Mechatron., vol. 6, no. 1, pp. 53–65, May 2024.



Original Article