DC Field | Value | Language
dc.contributor.author | Millath Mahir, Mahboob Umar Mahmoodh |
dc.date.accessioned | 2025-06-19T05:09:51Z |
dc.date.available | 2025-06-19T05:09:51Z |
dc.date.issued | 2024 |
dc.identifier.citation | Millath Mahir, Mahboob Umar Mahmoodh (2024) Interpretable Deep Learning Architecture for Time Series Forecasting with Sentiment Analysis. BSc. Dissertation, Informatics Institute of Technology | en_US
dc.identifier.issn | 20200181 |
dc.identifier.uri | http://dlib.iit.ac.lk/xmlui/handle/123456789/2681 |
dc.description.abstract | "Time series (TS) forecasting is a crucial area in various domains and has been widely researched in the recent past. Although notable research and innovation have been conducted in this area with the help of Machine Learning (ML), Deep Learning (DL), and Natural Language Processing (NLP), these approaches frequently lack interpretability, making it challenging to comprehend the factors influencing predictions. Furthermore, adding sentiment analysis to forecasting models might yield insightful results; however, combining these features with interpretability requirements is still a challenge in the field of time series forecasting (TSF).
To address this gap, this thesis proposes a novel deep learning architecture for time series forecasting that makes use of sentiment analysis of stock news and integrates Explainable Artificial Intelligence (XAI), allowing for result interpretability. In the proposed model, modern NLP approaches such as VADER are used to merge sentiment scores derived from financial news with Gated Recurrent Units (GRU). To improve interpretability, the model's predictions were explained in plain and understandable terms using LIME (Local Interpretable Model-agnostic Explanations) and compared with results from other XAI libraries such as SHAP (SHapley Additive exPlanations). Metrics such as Mean Squared Error (MSE), Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE) were used for evaluation. The results showed that prediction accuracy was greatly improved by combining sophisticated deep learning models with sentiment analysis. Furthermore, XAI was integrated to generate easily interpretable, text-based explanations." | en_US
dc.language.iso | en | en_US
dc.subject | Time Series (TS) Forecasting | en_US
dc.subject | Explainable Artificial Intelligence (XAI) | en_US
dc.subject | Sentiment Analysis | en_US
dc.title | Interpretable Deep Learning Architecture for Time Series Forecasting with Sentiment Analysis | en_US
dc.type | Thesis | en_US
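
The abstract above outlines a pipeline that merges VADER sentiment scores from financial news with a GRU forecaster, evaluates it with MSE, MAE, and RMSE, and explains the predictions with LIME. Below is a minimal sketch of how such a pipeline could be wired together; it is not the dissertation's actual code, and the synthetic data, the ten-day window, the network size, and the flattened feature layout fed to LIME are all assumptions made purely for illustration.

# Illustrative sketch only: VADER sentiment scores from news headlines are combined
# with closing prices, fed to a GRU forecaster, evaluated with MSE/MAE/RMSE, and
# explained with LIME. Data and hyperparameters are placeholder assumptions.
import numpy as np
import tensorflow as tf
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer
from sklearn.metrics import mean_squared_error, mean_absolute_error
from lime.lime_tabular import LimeTabularExplainer

# 1. Sentiment feature: VADER compound score for each day's headline (synthetic feed).
analyzer = SentimentIntensityAnalyzer()
headlines = (["Shares rally on strong quarterly earnings",
              "Regulators open probe into the company",
              "Analysts stay neutral ahead of results"] * 50)
sentiment = np.array([analyzer.polarity_scores(h)["compound"] for h in headlines])

# 2. Price series (synthetic random walk) aligned with the sentiment scores.
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, size=len(headlines)))

# 3. Sliding windows: each sample is `lookback` days of (price, sentiment);
#    the target is the next day's price.
lookback = 10
features = np.column_stack([prices, sentiment])
X = np.stack([features[i:i + lookback] for i in range(len(features) - lookback)])
y = prices[lookback:]
split = int(0.8 * len(X))
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

# 4. GRU forecaster.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(lookback, 2)),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=20, batch_size=16, verbose=0)

# 5. Evaluation with MSE, MAE, and RMSE.
y_pred = model.predict(X_test, verbose=0).ravel()
mse = mean_squared_error(y_test, y_pred)
mae = mean_absolute_error(y_test, y_pred)
rmse = np.sqrt(mse)
print(f"MSE={mse:.3f}  MAE={mae:.3f}  RMSE={rmse:.3f}")

# 6. LIME explanation over the flattened window features (price/sentiment per lag day).
feature_names = [f"{name}_t-{lookback - i}"
                 for i in range(lookback) for name in ("price", "sentiment")]
explainer = LimeTabularExplainer(
    X_train.reshape(len(X_train), -1),
    feature_names=feature_names,
    mode="regression",
)
predict_fn = lambda rows: model.predict(rows.reshape(-1, lookback, 2), verbose=0).ravel()
explanation = explainer.explain_instance(X_test[0].reshape(-1), predict_fn, num_features=5)
print(explanation.as_list())  # top features pushing the forecast up or down

A SHAP comparison, as mentioned in the abstract, could presumably be run on the same flattened windows (for example with shap.KernelExplainer around the same prediction function), but the exact explainers, data, and hyperparameters used in the dissertation are not stated in this record.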