Indian Student Develops AI Model for Real-Time Translation of Sign Language


A student from Tamil Nadu, India, has developed an AI model for real-time sign language translation, a rare feat that no big tech company has achieved so far.

Looking to do something outside the box as a third-year engineering student, Priyanjali Gupta created an AI model that translates American Sign Language (ASL) into English in real time. Priyanjali shared the project on LinkedIn, the professional networking and career development site, writing in her post: "Made this model using the cool TensorFlow Object Detection API; the model translates a few ASL signs to English using transfer learning from the pretrained ssd_mobilenet model." The model translates basic signs such as Hello, Bye, and Thank You. She also shared a demo video that showcases the application in action.
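Since the post names the TensorFlow Object Detection API and a pretrained ssd_mobilenet backbone, the real-time demo likely runs an inference loop roughly like the sketch below: grab a webcam frame, run the fine-tuned detector on it, and overlay the predicted sign when the confidence is high. This is only a minimal sketch; the model path, class IDs, labels, and confidence threshold are illustrative assumptions, not details taken from her project.

# Minimal sketch: real-time inference with a TensorFlow Object Detection API
# model fine-tuned from ssd_mobilenet (paths and labels are assumptions).
import cv2
import numpy as np
import tensorflow as tf

LABELS = {1: "Hello", 2: "Bye", 3: "Thank You"}                 # assumed class IDs
detect_fn = tf.saved_model.load("exported_model/saved_model")   # assumed export path

cap = cv2.VideoCapture(0)                                       # webcam feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    input_tensor = tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8)
    detections = detect_fn(input_tensor)                        # boxes, classes, scores

    scores = detections["detection_scores"][0].numpy()
    classes = detections["detection_classes"][0].numpy().astype(int)
    if scores[0] > 0.7:                                         # show top detection if confident
        label = LABELS.get(classes[0], "unknown")
        cv2.putText(frame, f"{label}: {scores[0]:.2f}", (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)

    cv2.imshow("ASL translation demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):                       # press q to quit
        break

cap.release()
cv2.destroyAllWindows()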

Development of the Model

In its initial stage, the model translates only a few basic ASL signs. In an interview, she explained that the dataset for the model was built with a Python library aimed at real-time computer vision, which captures images from the webcam. Responding to one of the comments on her LinkedIn post, she admitted that developing a machine learning model solely for sign language detection is a hard task to achieve, but not unfeasible, and said she believes that sooner or later the much larger community of machine learning developers will build deep learning models dedicated solely to sign language.
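The Python library she refers to for real-time computer vision is most commonly OpenCV, so a dataset-collection script of this kind might look roughly like the sketch below: capture a handful of webcam frames for each sign and save them to labelled folders. The sign list, sample count, and folder layout are assumptions for illustration, and the saved images would still need bounding-box annotations before training the detector.

# Rough sketch: collecting webcam images per sign with OpenCV
# (sign names, counts, and folders are assumed, not from the project).
import os
import time
import cv2

SIGNS = ["hello", "bye", "thank_you"]    # assumed sign labels
IMAGES_PER_SIGN = 15                     # assumed number of samples per sign

cap = cv2.VideoCapture(0)
for sign in SIGNS:
    os.makedirs(os.path.join("dataset", sign), exist_ok=True)
    print(f"Collecting images for '{sign}' - get ready...")
    time.sleep(3)                        # time to position the hand sign
    for i in range(IMAGES_PER_SIGN):
        ok, frame = cap.read()
        if not ok:
            break
        path = os.path.join("dataset", sign, f"{sign}_{i}.jpg")
        cv2.imwrite(path, frame)         # save the webcam frame to disk
        cv2.imshow("capture", frame)
        cv2.waitKey(500)                 # short pause between captures

cap.release()
cv2.destroyAllWindows()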

She also credited Nicholas Renotte, a Senior Data Science and AI Specialist at IBM, whose tutorials on real-time sign language detection helped her build the project.

AI is one of the fastest-growing technologies in the world, with applications in every field, from healthcare, manufacturing, and banking to astrophysics and the life sciences. Simply put, AI allows organizations to make better data-driven decisions, improving core processes and increasing the speed and accuracy of decision making.
