
Advancing Brain Tumour Diagnosis through Image Processing with Explainable AI for Enhanced Interpretability: A Pathway for Interpretable Medical Decision Support [BrainX Assist]


dc.contributor.author Dabura Vithanachchige, Janidu
dc.date.accessioned 2025-06-09T07:10:21Z
dc.date.available 2025-06-09T07:10:21Z
dc.date.issued 2024
dc.identifier.citation Dabura Vithanachchige, Janidu (2024) Advancing Brain Tumour Diagnosis through Image Processing with Explainable AI for Enhanced Interpretability: A Pathway for Interpretable Medical Decision Support [BrainX Assist]. BSc. Dissertation, Informatics Institute of Technology en_US
dc.identifier.issn 20200929
dc.identifier.uri http://dlib.iit.ac.lk/xmlui/handle/123456789/2482
dc.description.abstract Brain tumours are among the most common, yet deadly, cancers seen in both adults and children, and their incidence has risen over the years. Early diagnosis plays a pivotal role in improving the chances of a cure, but it is often hindered by time-consuming diagnostic processes. MRI (Magnetic Resonance Imaging) screening is commonly used in diagnosing brain tumours; however, it demands sustained, concentrated attention from healthcare professionals with expertise in the subject, so misdiagnoses due to human error remain possible. With this growing concern and the advancement of technology, deep learning algorithms have been applied to the task and have already achieved high accuracy and robustness. However, these systems are regarded as untrustworthy and are restricted from deployment in medical workflows because they offer no reasoning behind their predictions; a blind prediction from an AI system is insufficient for healthcare professionals making life-critical judgements. This research proposes a pipeline that automates the detection and classification of over 15 different types of brain tumour while producing both visual and textual explanations. The proposed architecture uses a transfer-learning approach based on the pre-trained EfficientNet-B7 model, together with explainable AI (XAI) techniques including LIME, Grad-CAM, Layer-CAM, SmoothGrad and Guided Backpropagation. First, the image is preprocessed with CLAHE (Contrast Limited Adaptive Histogram Equalization) and then passed to the classifier. Second, the predicted tumour type, together with the DL model, is fed to the XAI techniques to generate the explanations. The model achieved a validation accuracy of 94.22% with a loss of 0.2165 and an AUC-ROC of 0.99862. en_US
dc.language.iso en en_US
dc.subject Tumour Classification en_US
dc.subject Explainable AI (XAI) en_US
dc.subject MRI Images en_US
dc.title Advancing Brain Tumour Diagnosis through Image Processing with Explainable AI for Enhanced Interpretability: A Pathway for Interpretable Medical Decision Support [BrainX Assist] en_US
dc.type Thesis en_US
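
The abstract describes CLAHE preprocessing ahead of classification. Below is a minimal Python/OpenCV sketch of that step; the clip limit, tile grid size and the 600x600 target size (EfficientNet-B7's native input resolution) are illustrative assumptions, as the record does not state the dissertation's actual parameters.

```python
import cv2
import numpy as np

def preprocess_mri(path, size=(600, 600)):
    """Load a grayscale MRI slice, apply CLAHE, and prepare it for the classifier."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # CLAHE with illustrative parameters; the dissertation's values are not given here.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    img = clahe.apply(img)
    # 600x600 matches EfficientNet-B7's native input resolution.
    img = cv2.resize(img, size)
    # Replicate the single channel to 3 channels for an ImageNet-pretrained backbone.
    # Keras' EfficientNet applies its own input rescaling, so values stay in [0, 255].
    return np.stack([img] * 3, axis=-1).astype(np.float32)
```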
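For the transfer-learning stage, a minimal Keras sketch of an EfficientNet-B7 classifier follows. The frozen backbone, the pooling/dropout head, the optimiser and the 15-class output are all assumptions; the abstract states only that a pre-trained EfficientNet-B7 was used to classify over 15 tumour types.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 15  # "over 15 different types" per the abstract; exact count assumed

def build_classifier():
    """EfficientNet-B7 backbone with a small classification head (transfer learning)."""
    base = tf.keras.applications.EfficientNetB7(
        include_top=False, weights="imagenet", input_shape=(600, 600, 3))
    base.trainable = False  # freeze the backbone for the initial training phase
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dropout(0.3)(x)  # illustrative head; not the dissertation's stated design
    out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = models.Model(base.input, out)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
    return model
```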
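Of the listed XAI techniques, Grad-CAM is the most compact to sketch: it weights the last convolutional layer's activations by the pooled gradients of the predicted class score. The layer name "top_conv" matches Keras' EfficientNet implementation but is an assumption about the dissertation's model.

```python
import numpy as np
import tensorflow as tf

def grad_cam(model, image, layer_name="top_conv", class_index=None):
    """Grad-CAM: weight last-conv activations by pooled class-score gradients."""
    grad_model = tf.keras.models.Model(
        model.input, [model.get_layer(layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[np.newaxis, ...])
        if class_index is None:
            class_index = int(tf.argmax(preds[0]))  # default to the predicted class
        score = preds[:, class_index]
    grads = tape.gradient(score, conv_out)        # d(score) / d(activations)
    weights = tf.reduce_mean(grads, axis=(1, 2))  # global-average-pool the gradients
    cam = tf.nn.relu(tf.reduce_sum(weights[:, None, None, :] * conv_out, axis=-1))[0]
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()  # normalise heatmap to [0, 1]
```

In practice the returned heatmap is upsampled to the input resolution and overlaid on the MRI slice to produce the visual explanation the abstract describes.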

