A.I. for

Accessibility

Hackathon

This competition is aimed at rallying talents and fostering the regional development of the innovative entrepreneurship community related to artificial intelligence while also increasing social inclusiveness. Join our LinkedIn event page to learn more and ideate.

Join our live event today, June 26, at 3:00 pm (Beirut Time)

According to the World Health Organization, 1 billion individuals, or 15% of the world population, are considered to have disabilities which exclude them from education and the workplace, weaken their potential to connect and communicate effectively, and limit their ability for independent living. With the shift to online living, more barriers arise and the need for inclusiveness and accessibility is stronger than ever.

Through the advancement of assistive technologies, new possibilities have opened up for overcoming barriers to basic living activities. Artificial Intelligence (AI) plays a crucial role in the development of assistive technologies, enhancing an inclusive user experience and removing barriers. AI advances such as predictive text, visual recognition, and speech-to-text transcription are already showing enormous potential for helping people with vision, hearing, cognitive, learning, and mobility disabilities – as well as a range of mental health conditions.

Tracks

This hackathon provides an opportunity to compete on two separate tracks:

1. Assistive Technology

Artificial intelligence offers huge potential in helping people with vision, hearing, mobility, cognition and learning disabilities, whether temporary or permanent, to lead a better quality of life. AI-powered assistive technologies have proven to offer significant benefits for persons with disabilities (PWDs), especially in terms of human connection, everyday life in the modern world and employment. All this is possible through accessibility tools, digital accessibility, and inclusive design.

When such design incorporates AI, it has the power to enhance assistive products through powerful features. In fact, one area where AI and machine learning have made a significant contribution is access to information, such that disabilities do not limit the opportunities for education, employment, and a better life. With AI-enabled tools, no one gets left behind.

Examples include:

– Video captioning
– Audio books
– Live captioning/translation/interpretation
– Text-to-speech and speech-to-text
– AI-based visual aids
– Smarter glasses
– Cognitive hearing aids
– Sign-to-text
– Adaptive teaching and learning software

and other solutions that expand opportunities for education and for equal-opportunity employment.

2. Physical Aids

Artificial intelligence can also provide physical aid and mobility. AI researchers have joined efforts with multiple industries to change the lives of people with temporary or permanent physical disabilities, whether motor, visual, or auditory.

In robotics, examples of AI-supported physical aids include:

  • a prosthetic limb that uses artificial intelligence to mimic the motion of the user’s residual leg, making walking smoother and more intuitive. Instead of relying on preprogrammed movements to drive the prosthetic, it uses adaptive AI that learns in the moment how to move.
  • the soft, assistive robotic glove for patients with weakened hand function. With the help of this AI product, individuals can perform actions such as bending their fingers, rotating and twisting their thumbs, and grasping. The aim of this innovation, as described by lead investigator Conor Walsh, assistant professor at the Wyss Institute, is “to restore independence for people who have lost the ability to grasp.”
  • Intel and Brown University’s active projects on connecting the brain and spine, which aim to employ AI technology to restore movement for patients paralyzed by critical spinal cord injuries.

Another form of physical aid is enhanced mobility. Applications include:

  • AI-enhanced wheelchairs that are controlled by facial expressions or self-driving based on eye tracking
  • self-driving cars aimed at people with disabilities
  • vehicles, like the Concept i-RIDE, that, though not self-driving, create more automotive independence for individuals who use a wheelchair.

AI has also revolutionized smart homes to enable independent living through:

  • voice technology used to control lights or adjust the heating and air conditioning,
  • smart home hubs that learn your routine over time and anticipate your needs,
  • systems that adjust the thermostat using weather-forecast data.

Soon, smart homes may include AI-enhanced automation of furniture to support independent living, and we may even witness smart cities offering AI-enhanced accessibility for PWDs.
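The weather-aware thermostat idea above can be sketched as a simple rule: nudge the setpoint using the forecast so the home pre-adjusts before conditions change. This is an illustrative sketch only; the function name and every threshold are invented for the example, and a real system would learn these from the occupant's routine.

```python
# Hypothetical sketch of a weather-aware thermostat rule.
# All thresholds are invented for illustration.

def adjust_setpoint(current_setpoint, forecast_high_c):
    """Return a new thermostat setpoint (deg C) given the forecast daily high."""
    if forecast_high_c >= 30:   # hot day ahead: pre-cool slightly
        return current_setpoint - 1
    if forecast_high_c <= 5:    # cold snap ahead: pre-heat slightly
        return current_setpoint + 1
    return current_setpoint     # mild forecast: leave the setpoint alone
```

For example, with a 22 °C setpoint and a forecast high of 35 °C, the rule would pre-cool to 21 °C.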

Expected Outcome

Competitors are expected to provide solutions in the form of AI-based assistive technology or physical aids, focusing on one or more of the following four areas:

Education

Existing examples include adaptive math teaching and learning applications, live captioning, and web-accessibility remediation solutions.

Employment

Existing examples include live captioning/translation/interpretation, text-to-speech, speech-to-text.

Independent Living and Home

Existing examples include AI-enhanced wheelchairs, AI-enhanced prostheses, adaptive smart home hubs, and the i-RIDE.

Community

Existing examples include video captioning, audio books, live captioning/translation/interpretation, AI-based visual aids, smarter glasses, cognitive hearing aid, sign-to-text, web-accessibility remedy solutions.

  • Date

From the 24th to the 26th of June.

  • Prizes

Winners will receive $20,000 and the opportunity for incubation at iPARK.

  • Target Participants/Groups
  • A team of 2-4 participants
  • Must be from the MENA region
  • Must have a team member affiliated with AUB as an alumnus/alumna, graduate student, undergraduate student, staff, or faculty member.
  • Must use AI for creating an assistive technology or physical aid solution
  • Important Dates
May 24

Kickoff and Orientation Event

 

Time: May 24
5:00 PM – Opening Note with Dr. Yousif Asfour – CITO AUB
5:10 PM – Keynote Speech: Global Accessibility Awareness Day (GAAD) – Jennison Asuncion
5:20 PM – Microsoft – Microsoft AI Technologies for Accessibility – Samer Chidiac
5:40 PM – Mada Innovation Program – An Ecosystem for Innovators – Dr. Ashraf Othman
5:50 PM – Zaka and Beirut AI – All About the AI Community – Christophe Zoghbi
6:00 PM – Break
6:10 PM – Orientation – Hackathon Kickoff
6:30 PM – Inclusive – Accessibility by Design – Saira Sayed & Hafsa Qadeer
7:00 PM – Discussion Panel: AI for Accessibility – The Role of Startups

Key2Enable – Jose Rubinger

Jade Autism – Ronaldo Cohin 

Takalam Tech – Abdallah Al-Faris 

June 10

Application Deadline for the Hackathon

June 15

Shortlisted Teams

 

June 18, 19

Capacity Building Workshops

June 20

Video Pitch Submission

June 24

Finalists Selection

June 25, 26

Hackathon and Awards Ceremony

  • Final Participants

We are thrilled to share the semi-finalists who made it to the final ceremony on Saturday June 26th:

 

 

Team Name – Team Members – Idea

Esmaani

Zahraa Baysouni

Jana Kabrit

Abelrahim Al Mohammad

Nabil El Miri

Esmaani is a platform that translates sign language to text using your device’s camera. It reads the translated text aloud for safety and social distancing, translates text to sign language in Arabic and English, translates audio to sign language, and has an emergency feature. It can be used on phones, and we aim to create a commercial model that can be installed in companies, government buildings, and public spaces.
Kanari AI

Massa Baali

Ahmed Ali

Amir Hussein

Ryan Carmichael

 

Kanari AI’s unique live Dialectal Arabic application can be used to transcribe and share meetings, presentations, and other conversations in real time with users via QR code. This live speech-to-text application allows those with hearing challenges to understand and enjoy the topics being presented.
Vision Restored

Leila Habli

Ola Ghattas

Leen Ghattas

Vision Restored aims to develop glasses with a connected speaker and microphone that allow people who are blind to perform their daily tasks independently and thus adapt to daily life.
اشرلي

Abdelrahman Abozied

Zainab AlMeraj

Rana Chams Bacha

Omar Bekdache

 

اشرلي proposes an AI model that can automatically translate Arabic Sign Language to text, with the goal of offering those with hearing impairments a chance at independent living and an easier way to integrate within hearing communities. Our tool will start with an existing dataset, from which we will extract signs in frames and use them to train and test a neural network model. We will then evaluate the findings with competent sign language translators and enhance the model using their feedback.
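The train-and-evaluate loop this team describes (extract sign frames, train a classifier, measure accuracy) can be sketched in miniature. This is an illustrative stand-in only: a real system would extract frame features from video with a convolutional network, whereas here synthetic feature vectors and a one-layer softmax classifier (all names and data invented) merely demonstrate the workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_synthetic_frames(n_per_class, n_classes, dim):
    """Generate toy 'frame features' clustered around one centroid per sign."""
    centroids = rng.normal(size=(n_classes, dim)) * 3.0
    X = np.vstack([centroids[c] + rng.normal(size=(n_per_class, dim))
                   for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)
    return X, y

def train_softmax(X, y, n_classes, lr=0.1, epochs=200):
    """Train a one-layer softmax classifier with batch gradient descent."""
    W = np.zeros((X.shape[1], n_classes))
    onehot = np.eye(n_classes)[y]
    for _ in range(epochs):
        logits = X @ W
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        W -= lr * X.T @ (probs - onehot) / len(X)     # cross-entropy gradient
    return W

def predict(W, X):
    """Pick the most probable sign class for each frame."""
    return np.argmax(X @ W, axis=1)

X, y = make_synthetic_frames(n_per_class=50, n_classes=4, dim=16)
W = train_softmax(X, y, n_classes=4)
accuracy = (predict(W, X) == y).mean()
```

On these well-separated synthetic clusters the classifier fits the training data almost perfectly; the team's evaluation step with sign language translators corresponds to checking `accuracy` on held-out, human-reviewed data instead.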
Kratu AI (Predict Practice Prevent – Neuro Disorder)

 Karim Hout

Navpreet Bhatti

Mahasen Hamze

Angayarkanni Pitchumani

A mobile app whose machine learning algorithm gathers the trajectory of finger movements over the screen with respect to time and compares it with a dataset of healthy individuals to predict early-stage neurological disorders from three simple tasks, in which patients touch the screen with one or more fingers as close as possible to indicated targets. The app includes the following modules to help patients: 1) activities-of-daily-living-based cognitive training; 2) monitoring; 3) dementia screening; 4) socialization through a chatbot; 5) tracking; 6) recommendation of rehabilitation centers; thereby helping prevent later stages.
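The core comparison this idea rests on, scoring how far a user's touches land from the indicated targets and flagging sessions far outside a healthy baseline, can be sketched as follows. Every function name, baseline value, and threshold here is invented for illustration; a real screening tool would learn its baseline from clinical data and would never diagnose from one session.

```python
import numpy as np

def deviation_score(touches, targets):
    """Mean Euclidean distance between touch points and their targets."""
    touches = np.asarray(touches, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return float(np.linalg.norm(touches - targets, axis=1).mean())

def flag_session(touches, targets, healthy_mean=5.0, healthy_std=2.0, z=2.0):
    """Flag a session for follow-up if its score exceeds the (assumed)
    healthy baseline by more than z standard deviations."""
    return deviation_score(touches, targets) > healthy_mean + z * healthy_std

# Toy sessions: one steady, one erratic (coordinates in screen units).
targets = [(10, 10), (50, 50), (90, 10)]
steady  = [(11, 9), (49, 51), (90, 12)]    # lands near the targets
erratic = [(25, 30), (70, 20), (60, 40)]   # lands far from the targets
```

With these toy numbers the steady session scores well under the flag threshold while the erratic one scores well above it, which is the signal the app's monitoring modules would track over time.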
Mighty minders

Jack Khoueiry

William Salame

Joelle Georgeous

A hand-sized device that will assist the user through every task. It will replace their phone, laptop, TV, and even their walking stick, with further future features that will help the user feel less and less different.
ComAi

Zein Shehabeddine

Mohamad Ali Tabbara

Omar Al Itani

Hassan Jaber

 

The project aims to help people who are deaf and/or mute to communicate in English with others with the help of technology: the program detects American Sign Language (ASL) as input and translates it into natural language. This is done by training a neural network on a dataset using TensorFlow/PyTorch.
The Glens

 Nijad AlDubayssi

Amine Berjaoui Tamhaz

Rafic Al Ayass

Frederic Aboud

The Glens propose a prosthetic hand with autonomous action capabilities. The hand includes a camera that uses Computer Vision object detection algorithms (Neural Network) to associate an object (cup, pen) to a grasp (cylindrical, pinch). Then the hand proceeds to execute the full grasping process with minimal user input. The user only acts as a stop/continue switch by providing a single myoelectric signal from the residual muscles of the forearm.
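The decision logic The Glens describe, a detected object label selecting a grasp type, gated by a single myoelectric stop/continue signal, can be sketched as a lookup plus a veto check. This is a hypothetical illustration: the mapping table and function names are invented, and the object label would come from the hand's computer-vision detector rather than being passed in directly.

```python
# Assumed object-to-grasp mapping, for illustration only.
OBJECT_TO_GRASP = {
    "cup": "cylindrical",
    "pen": "pinch",
    "ball": "spherical",
}

def plan_grasp(detected_object, myo_signal_active):
    """Return the grasp to execute, or None if the user signals stop
    (via the myoelectric switch) or the object is not recognized."""
    if not myo_signal_active:
        return None  # user vetoed: halt the grasping sequence
    return OBJECT_TO_GRASP.get(detected_object)
```

For example, detecting a cup while the myoelectric signal is active selects the cylindrical grasp, while dropping the signal at any point aborts the sequence, matching the single stop/continue role the team assigns to the user.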
EventPro

 Ali Charaf

Nizam Mahanna

 

The software provides assistance that the majority of virtual platforms lack, such as voice-command virtual assistance, a sign-language avatar, and captioning.
Cortex

 Dhia Eddine Nini

Nabil Houari

Ralph Akiki

Tala El Masri

Cortex is a software-based solution that uses AI tracking software over a 12-hour period to assess executive-function skills and ADHD. Cortex applies complex AI calculations to eye-tracking data and to how you use your computer throughout the day. It calculates: 1) your attentiveness and distractions across web pages and open software; 2) your shift from a Task-Positive-Network brain mode to the Default-Mode-Network (your non-focusing, thinking, imagining, and sometimes ruminating brain mode); 3) your stress tolerance and emotional control, through a combination of facial recognition, speed changes, and text, among others; 4) your task initiation, sustained attention, and goal-directed persistence, by comparing the planned goal with its execution and studying your performance; among other executive functions. Cortex provides a score and an accurate-to-life assessment of each skill, along with tailored recommendations. Cortex can be used for ADHD and autism assessment, as well as a productivity-assessment tool for career-oriented, highly skilled professionals or for trainers working with kids in schools.
MLTK

Malaz Tamim

Louay Abo Alzahab

Kinana Al-Rimawi

Tarek Sheikh Al-Shbab

 

To help people with ALS and similar disabilities, our proposed solution is to build a non-standard speech-to-text tool (with Arabic support) that can help in understanding what they are saying. The tool will be built as SaaS (Software as a Service) so that other developers can use it and implement it in their applications. This will help spread the technology, and we will see it in more applications that make disabled people’s lives easier. For example, the tool could be implemented in an IoT app that lets a person control their room’s lighting, TV, and any other electric device using their voice.
Wheely Wheel Team

 Karen Kordab

Youssef Jaafar

Cyrine Soufi

Layal Tannous

 

Wheely Wheel is a map that includes you wherever you go: a revolutionary map that offers people with special needs the ability to navigate places that are accessible to them based on their disability type. Wheely Wheel caters to a world customized to every person’s needs, featuring places based on their accessibility metrics. It rewards and highlights accessible places and motivates inaccessible ones to start their own accessibility movement. Wheely Wheel envisions a world that gives everyone the right to span the globe stress-free.
Reach-Up

Hisham Ramadan

Mira Khaled 

– Deliver online teaching from the safety of the student’s home.

– Provide an engaging and varied curriculum.

– Monitor the progress through the AI platform configured for adapted learning.

 

  • Previous Events

Join us on May 24 at 5:00 PM to get introduced to the AI for Accessibility Hackathon, meet our partners, and learn from experts who are presenting solutions for people with determination today.

For any further inquiries, contact us at: zeinaubipark@aub.edu.lb

 

  • Useful Resources

In Collaboration with:

Sponsored by:

Powered by: