
GenTex: A Variant of LeakGAN for Text Generation


dc.contributor.author Varatharajah, Vaseekaran
dc.date.accessioned 2023-01-04T03:35:50Z
dc.date.available 2023-01-04T03:35:50Z
dc.date.issued 2022
dc.identifier.citation Varatharajah, Vaseekaran (2022) GenTex: A Variant of LeakGAN for Text Generation. BEng. Dissertation, Informatics Institute of Technology en_US
dc.identifier.issn 2018617
dc.identifier.uri http://dlib.iit.ac.lk/xmlui/handle/123456789/1255
dc.description.abstract "Text generation is the task of serially predicting the next word for a given sequence of text using language models. A primary use of text generation is data synthesis: the constant demand for data in machine learning and deep learning tasks can be addressed by generating new data from existing examples. Text generation evolved from statistical models to neural networks, but in both cases the quality of the generated text was limited because these models struggled to capture context. Recurrent Neural Networks (RNNs) gained prominence for their contextual modelling, yet their architecture handles long text sequences poorly, producing less attractive results in text generation. Although Long Short-Term Memory (LSTM) networks address this issue, they are less effective under adversarial manipulation, which has turned researchers toward Generative Adversarial Networks (GANs) for text generation. This research focuses on text generation with a modified GAN, trained and evaluated on the COCO Image Captions dataset. The thesis presents work on modifying the architecture of LeakGAN, a GAN-based model for unconditional text generation. The resulting variant, which uses Gated Recurrent Unit (GRU) components for the manager and worker modules and modifies the training pattern of the discriminator, displayed improved results: a BLEU score of 0.836, beating the benchmark performance of the original LeakGAN architecture. This study thus effectively addresses the problems of using GANs and models such as RNNs for text generation." en_US
dc.language.iso en en_US
dc.subject Generative Adversarial Networks en_US
dc.subject Text Generation en_US
dc.subject Recurrent Neural Networks en_US
dc.subject Gated Recurrent Units en_US
dc.title GenTex: A Variant of LeakGAN for Text Generation en_US
dc.type Thesis en_US
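
The abstract describes swapping Gated Recurrent Units into LeakGAN's manager and worker modules. As a minimal sketch of the recurrent unit involved, the following numpy code implements a single GRU cell and steps a hypothetical "manager" through a sequence; all names, sizes, and the training setup are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (illustrative; not the thesis implementation)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # One weight matrix per gate, acting on [input; hidden] concatenation.
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size + hidden_size))
        self.Wr = rng.uniform(-s, s, (hidden_size, input_size + hidden_size))
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size + hidden_size))

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                                # update gate
        r = sigmoid(self.Wr @ xh)                                # reset gate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))  # candidate state
        return (1.0 - z) * h + z * h_tilde  # interpolate old and candidate state

# In a LeakGAN-style hierarchy, a "manager" GRU would emit a goal vector each
# step, and a "worker" GRU would condition next-token logits on that goal.
manager = GRUCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for _ in range(5):
    h = manager.step(np.ones(8), h)
print(h.shape)
```

Because the update gate interpolates between the previous state and a bounded tanh candidate, the hidden state stays in (-1, 1), which is one reason GRUs train stably on long sequences with fewer parameters than an LSTM.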

