Artificial Intelligence (AI) Enhanced Eye Tracking System for Tetraplegia Patients

Authors

  • Farhaan N, Department of Computer Science and Engineering, K. Ramakrishnan College of Engineering, Tamil Nadu, India
  • Arshathul Mohamed Haq B, Department of Computer Science and Engineering, K. Ramakrishnan College of Engineering, Tamil Nadu, India
  • Bogar S, Department of Computer Science and Engineering, K. Ramakrishnan College of Engineering, Tamil Nadu, India
  • Nithya N, Assistant Professor, Department of Computer Science and Engineering, K. Ramakrishnan College of Engineering, Tamil Nadu, India

DOI:

https://doi.org/10.47392/irjash.2023.S044

Keywords:

Locked-in Syndrome, Eye-gaze movement, Tetraplegic patient, Convolutional Neural Network

Abstract

This work targets people who cannot control any muscles other than those of the head and eyes. Medical conditions such as Locked-in Syndrome can cause paralysis or motor speech disorders, resulting in voice or speech impairments. Traditionally, many such people communicate through eye movements and blinks, and communication is essential for allowing them to express how they feel and what they need. In this system, we design a method of interpersonal communication for tetraplegic patients using an IoT module and an eye-tracking system based on a Convolutional Neural Network (CNN). Eye-blink detection and movement tracking can likewise enable communication for quadriplegic patients. With a voice-board speaker, the system can deliver spoken messages conveying all necessary information. The goal of this research was to develop an IoT module that enables tetraplegic people to send emergency information to the care team using deep learning and digital image processing. At the end of an interaction, the system renders the detected gaze movements as audio output.
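The abstract does not detail the blink-detection step, which in the paper is CNN-based. As a much simpler, self-contained illustration of how blink events can be extracted from per-frame eye measurements, the sketch below uses the eye aspect ratio (EAR) heuristic over hypothetical six-point eye landmarks; this is a common stand-in technique, not the authors' CNN pipeline, and all names and thresholds here are illustrative assumptions.

```python
import math

def eye_aspect_ratio(pts):
    """Eye aspect ratio (EAR) over six landmarks ordered
    (corner, top, top, corner, bottom, bottom): the ratio of
    vertical to horizontal eye opening. It drops toward zero
    when the eye closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    v1 = dist(pts[1], pts[5])  # first vertical opening
    v2 = dist(pts[2], pts[4])  # second vertical opening
    h = dist(pts[0], pts[3])   # horizontal eye width
    return (v1 + v2) / (2.0 * h)

def detect_blinks(ear_series, threshold=0.21, min_frames=2):
    """Count blinks in a sequence of per-frame EAR values: each run
    of at least `min_frames` consecutive frames below `threshold`
    is treated as one blink. Threshold and frame count are
    illustrative, not values from the paper."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # blink still in progress at end of series
        blinks += 1
    return blinks
```

In a full system, each counted blink (or a coded sequence of blinks) would be mapped to a message and forwarded over the IoT module to the voice-board speaker.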

Published

2023-05-28