dc.contributor.author | Rashad, Mohamed |
dc.date.accessioned | 2024-04-29T04:31:10Z |
dc.date.available | 2024-04-29T04:31:10Z |
dc.date.issued | 2023 |
dc.identifier.citation | Rashad, Mohamed (2023) SinhalaTextGenie - A Sinhala Language Model Trained To Predict and Generate Words. BSc. Dissertation, Informatics Institute of Technology | en_US
dc.identifier.issn | 2018470 |
dc.identifier.uri | http://dlib.iit.ac.lk/xmlui/handle/123456789/2081 |
dc.description.abstract | "As the use of Sinhala in the digital space continues to increase, there is a growing need for advanced Sinhala NLP tools that can enhance the user experience and contribute to the language's development. However, Sinhala lags behind high-resource languages in terms of NLP tools, lacking reliable features such as automated content generation, next-word prediction, chatbots, and auto-replies. To address this gap, it is crucial to develop and contribute reliable Sinhala language models that can promote the growth of the language in the digital realm. Transformer models have shown promising results in language modelling, yet no research has been conducted on developing a Sinhala transformer model for text generation. This work therefore trains a decoder-only transformer model on 300,000 diverse Sinhala sentences, averaging 9 words each, and shows the impressive results it can achieve. The model was built with 5 decoder layers and 4 attention heads. This study demonstrates the accurate predictions of the Sinhala transformer language model that has been developed." | en_US
dc.language.iso | en | en_US
dc.subject | Transformer model | en_US
dc.subject | Text generation | en_US
dc.subject | Decoder-only transformer model | en_US
dc.title | SinhalaTextGenie - A Sinhala Language Model Trained To Predict and Generate Words | en_US
dc.type | Thesis | en_US
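
The abstract above describes a decoder-only transformer language model with 5 decoder layers and 4 attention heads, trained on roughly 300,000 Sinhala sentences. The dissertation's own code is not part of this record; the following is only a minimal PyTorch sketch of such a configuration, with the vocabulary size, embedding size, and context length chosen purely for illustration.

```python
# Minimal sketch (not the author's code) of a decoder-only transformer LM
# with the dimensions stated in the abstract: 5 layers, 4 attention heads.
# vocab_size, d_model, and max_len are illustrative assumptions.
import torch
import torch.nn as nn

class SinhalaDecoderLM(nn.Module):
    def __init__(self, vocab_size=30000, d_model=256, n_heads=4,
                 n_layers=5, max_len=64):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        # A causally masked self-attention stack is the standard way to
        # express a "decoder-only" model with PyTorch's encoder layers.
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids):
        seq_len = token_ids.size(1)
        positions = torch.arange(seq_len, device=token_ids.device)
        x = self.token_emb(token_ids) + self.pos_emb(positions)
        # Upper-triangular mask: each position attends only to earlier ones.
        causal_mask = nn.Transformer.generate_square_subsequent_mask(seq_len)
        x = self.blocks(x, mask=causal_mask)
        return self.lm_head(x)  # next-token logits per position

# Example: logits for a batch of two 9-token sequences
# (the abstract reports ~9 words per sentence on average).
model = SinhalaDecoderLM()
dummy_ids = torch.randint(0, 30000, (2, 9))
logits = model(dummy_ids)
print(logits.shape)  # torch.Size([2, 9, 30000])
```

The causal mask is what makes the stack behave as a left-to-right language model: when predicting the next word, each position may attend only to the words before it.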