DC Field | Value | Language
dc.contributor.author | Ranasinghe, R. A. Emesh |
dc.date.accessioned | 2020-07-11T10:28:51Z |
dc.date.available | 2020-07-11T10:28:51Z |
dc.date.issued | 2019 |
dc.identifier.citation | Ranasinghe, R. A. Emesh (2019) A video chatting interface for deaf people. BSc. Dissertation, Informatics Institute of Technology. | en_US
dc.identifier.other | 2014108 |
dc.identifier.uri | http://dlib.iit.ac.lk/xmlui/handle/123456789/473 |
dc.description.abstract | Communication is the process through which we share our thoughts and express our feelings, yet for deaf people, or anyone unable to speak, it poses a challenge in day-to-day life. Because their methods of communication, such as sign language or texting, differ from those of hearing people, they are more likely to experience social barriers. It is fundamentally important that communication be treated as an equal exchange between both parties. The goal of this project is therefore to propose a system that helps bridge the communication barrier between deaf people and hearing people. The system is an Android video chatting application: when a deaf user signs, each video frame is captured and passed through image processing so that the gestures are translated into text, while for a hearing user the application translates the audio into ASL emoticons, which deaf users find easier to understand because they are simple hand-gesture icons. | en_US
dc.subject | Image Processing | en_US
dc.subject | Gesture Recognition | en_US
dc.subject | Convolutional Neural Networks | en_US
dc.title | A video chatting interface for deaf people | en_US
dc.type | Thesis | en_US