Digital Repository

AI chatbot for Sinhala sign language interpreter using hand gesture recognition

Show simple item record

dc.contributor.author Hannan, Arshad
dc.date.accessioned 2022-12-20T06:00:53Z
dc.date.available 2022-12-20T06:00:53Z
dc.date.issued 2022
dc.identifier.citation Hannan, Arshad (2022) AI chatbot for Sinhala sign language interpreter using hand gesture recognition. BEng. Dissertation, Informatics Institute of Technology en_US
dc.identifier.issn 2018197
dc.identifier.uri http://dlib.iit.ac.lk/xmlui/handle/123456789/1198
dc.description.abstract "The ability to see, hear, speak, and respond appropriately to events is one of every human being's most precious gifts, yet some individuals are denied these key faculties. Deaf and mute individuals have always found it challenging to communicate with their peers on a daily basis. Beyond these communicative challenges, the rapid rise of technology has introduced further barriers when interacting with devices, because most applications are not developed with deaf/mute user-friendly interaction in mind. AI chatbots have made many activities far more efficient across industries, but chatbots that rely on auditory and oral interaction cannot be used by deaf/mute individuals. The application presented here enables an AI chatbot to communicate with deaf/mute individuals by detecting sign language through hand gesture recognition. It provides a real-time sign language detection model together with an NLP-based AI chatbot model that understands human spoken language. The gesture recognition model recognizes the hand gestures the user provides and feeds them to the chatbot model, which constructs a meaningful reply to the user. Accuracies and losses of different methods were compared in order to select the most accurate of the existing object detection models for sign language detection. Based on these comparisons, the LSTM deep learning model achieved an accuracy of 96% and was therefore used in the application." en_US
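The abstract describes classifying hand-gesture sequences with an LSTM. As a rough illustration of that technique only, the sketch below runs a single LSTM forward pass over a sequence of per-frame hand-landmark vectors and produces gesture-class probabilities. All dimensions, weights, and the landmark encoding are illustrative assumptions, not details taken from the dissertation, and the randomly initialised parameters stand in for a trained model.

```python
import numpy as np

# Assumed dimensions (not from the thesis): 30 frames per gesture,
# 63 features per frame (21 hand landmarks x 3 coordinates),
# 32 hidden units, 5 gesture classes.
T, D, H, C = 30, 63, 32, 5

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised weights stand in for trained parameters.
W = rng.standard_normal((4 * H, D + H)) * 0.1   # stacked gate weights
b = np.zeros(4 * H)                              # stacked gate biases
W_out = rng.standard_normal((C, H)) * 0.1        # classification head

def lstm_classify(frames):
    """Run one LSTM pass over a (T, D) gesture sequence and
    return softmax probabilities over the C gesture classes."""
    h = np.zeros(H)  # hidden state
    c = np.zeros(H)  # cell state
    for x in frames:
        z = W @ np.concatenate([x, h]) + b
        i, f, o, g = np.split(z, 4)              # input/forget/output gates, candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)               # update cell state
        h = o * np.tanh(c)                       # update hidden state
    logits = W_out @ h                           # classify from final hidden state
    e = np.exp(logits - logits.max())            # numerically stable softmax
    return e / e.sum()

probs = lstm_classify(rng.standard_normal((T, D)))
```

In a full system of the kind the abstract outlines, the predicted gesture label would then be passed as text to the NLP chatbot model to generate a reply.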
dc.language.iso en en_US
dc.subject Deaf/mute en_US
dc.subject AI chatbot en_US
dc.subject Sign language en_US
dc.subject Disability en_US
dc.subject Impairments en_US
dc.title AI chatbot for Sinhala sign language interpreter using hand gesture recognition en_US
dc.type Thesis en_US

