An accessible and affordable tablet application for people with severe disabilities who can communicate only through eye movements. The tool will improve users' quality of life by offering communication with loved ones, support for monitoring patient care, and access to information in an inclusive and effective way. It will integrate open-source eye-tracking libraries such as OpenCV and Pupil Labs with machine learning models that recognize behaviors or reactions expressing pain, anxiety, stress, and other symptoms, and notify caregivers or family members.
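As a rough illustration of the kind of per-frame processing the eye-tracking component would involve, the sketch below detects eyes with an OpenCV Haar cascade and estimates a pupil center from the darkest region of each eye. It is a minimal, hypothetical example only; the actual application would rely on a calibrated pipeline such as the one provided by Pupil Labs, and the threshold value and camera index used here are assumptions for demonstration.

```python
import cv2

# Minimal sketch (not the project's actual pipeline): eye detection with a
# Haar cascade and a rough pupil-center estimate per detected eye.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml"
)

def pupil_centers(frame):
    """Return approximate (x, y) pupil centers found in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    centers = []
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, 1.3, 5):
        eye = gray[y:y + h, x:x + w]
        # The pupil is typically the darkest area: threshold it (the value 40
        # is an illustrative assumption) and take the centroid of dark pixels.
        _, dark = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
        m = cv2.moments(dark)
        if m["m00"] > 0:
            centers.append((x + int(m["m10"] / m["m00"]),
                            y + int(m["m01"] / m["m00"])))
    return centers

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # assumed tablet front camera
    ok, frame = cap.read()
    if ok:
        print(pupil_centers(frame))
    cap.release()
```

In the envisioned system, sequences of such gaze estimates, together with other observed reactions, would feed the machine learning component that flags possible pain, anxiety, or stress and triggers caregiver notifications.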
Supporting people with severe disabilities and conditions that affect mobility is difficult for families with limited financial resources. This is a critical and underserved need in the United States: the social and technological inclusion of people with severe disabilities, who often face significant barriers to full participation in society. By developing an application based on artificial intelligence and eye-tracking technology, I offer an innovative solution that can transform communication and care for this vulnerable population. This proposal will not only improve beneficiaries' quality of life by allowing them to interact effectively with their environment, but will also streamline care processes for caregivers and healthcare professionals, promoting more efficient and less costly care.