Digital Repository

Befast App for Early Stroke Detection

dc.contributor.author Amarasekara, Diluni
dc.date.accessioned 2024-02-15T06:53:19Z
dc.date.available 2024-02-15T06:53:19Z
dc.date.issued 2023
dc.identifier.citation Amarasekara, Diluni (2023) Befast App for Early Stroke Detection. MSc. Dissertation, Informatics Institute of Technology en_US
dc.identifier.issn 20220444
dc.identifier.uri http://dlib.iit.ac.lk/xmlui/handle/123456789/1695
dc.description.abstract "This dissertation describes the development and evaluation of a smart AI system for early stroke detection, aimed at identifying early stroke symptoms in patients who are ineligible for pharmacological or surgical treatment. Stroke kills 5.5 million people worldwide each year, making it the second leading cause of death. Nearly 800,000 Americans suffer strokes annually, at a cost of $34 billion. Over 50% of stroke patients are left with chronic impairment, indicating considerable morbidity. When a stroke prevents oxygen from reaching the brain, cells die rapidly, so time is crucial for treatment. Because 80% of stroke patients cannot recognize early stroke symptoms outside of hospital, early detection is vital to preventing severe brain injury. To address this problem, the author developed a fully automated human gesture, posture, and speech detection system based on BEFAST that can be used anywhere. The acronym stands for Balancing, Eyesight, Facial drooping, Arm quivering, Speech, and Time to contact emergency contacts. The implementation combined a hybrid neural network with gradient-boosted classification. These models learned human pose decoding and lip-reading to recognize symptoms independently and call emergency contacts. The implemented early stroke detection system was evaluated through its human pose detection and lip-reading models, both assessed using confusion matrices, accuracy, precision, recall, and F1 score. The lip-reading model was additionally evaluated at the character and word levels, including character error rate (CER) and word error rate (WER) estimates. The human pose detection model detected stroke symptoms with 100% accuracy from gestures and postures. The lip-reading model achieved character-level accuracy above 95% with a CER of 0.73%, and the word-level evaluation showed a WER of 2.21%, demonstrating the system's lip-reading capability for detecting stroke symptoms." en_US
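The CER and WER figures reported in the abstract are standard edit-distance metrics. As a minimal illustrative sketch (not the dissertation's actual evaluation code), they are typically computed as the Levenshtein distance between a reference transcript and the model's hypothesis, normalized by the reference length, at the character level for CER and the word level for WER:

```python
def edit_distance(ref, hyp):
    # Classic Levenshtein dynamic program over two sequences.
    m, n = len(ref), len(hyp)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all remaining reference tokens
    for j in range(n + 1):
        dp[0][j] = j  # insert all remaining hypothesis tokens
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution / match
    return dp[m][n]

def cer(reference, hypothesis):
    # Character Error Rate: edit operations per reference character.
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

def wer(reference, hypothesis):
    # Word Error Rate: edit operations per reference word.
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)
```

For example, a hypothesis differing from a four-word reference by one substituted word yields a WER of 0.25; the reported WER of 2.21% corresponds to roughly one word error per 45 reference words.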
dc.language.iso en en_US
dc.publisher IIT en_US
dc.subject Early stroke detection en_US
dc.subject Neurology en_US
dc.subject Real-time monitoring en_US
dc.subject Artificial Intelligence en_US
dc.subject Data science en_US
dc.title Befast App for Early Stroke Detection en_US
dc.type Thesis en_US

