Dzongkha Next Words Prediction Using Bidirectional LSTM
DOI: https://doi.org/10.17102/bjrd.rub.se2.038

Keywords: Dzongkha word prediction, Machine Learning, Recurrent Neural Network, Long Short-Term Memory, Bidirectional LSTM

Abstract
The Dzongkha Development Commission (DDC) of Bhutan is working to computerize Dzongkha, but the effort poses numerous challenges. At present, technology support for Dzongkha is limited to printing, typing, and storage, and typing a single Dzongkha word requires several keypresses, which makes Dzongkha input tedious. This paper studies next-word prediction for Dzongkha, with the aim of reducing keystrokes and making Dzongkha typing faster. The dataset, curated by the DDC, spans several genres and consists of 10,000 sentences with 4,820 unique words. From it, 52,150 training sequences were generated using N-gram methods, and the text was vectorized using embedding techniques. Several RNN-based models were evaluated for Dzongkha next-word prediction; a model with two Bi-LSTM layers of 512 hidden neurons each achieved the best accuracy of 73.89% with a loss of 1.0722.
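The N-gram sequence-generation step described in the abstract can be sketched as follows. This is a minimal illustration under assumed conventions (whitespace tokenization on a toy English corpus, a simple integer vocabulary), not the authors' actual pipeline; real Dzongkha text is typically segmented on the tsheg mark (་) rather than spaces.

```python
# Sketch of N-gram training-sequence generation for next-word prediction.
# Assumptions (not from the paper): whitespace tokenization, integer ids
# starting at 1 (0 reserved for padding), and a toy corpus standing in
# for the 10,000-sentence DDC dataset.

def build_vocab(sentences):
    """Map each unique token to an integer id, in order of first appearance."""
    vocab = {}
    for sent in sentences:
        for tok in sent.split():
            if tok not in vocab:
                vocab[tok] = len(vocab) + 1
    return vocab

def ngram_sequences(sentences, vocab):
    """Emit every prefix of length >= 2 of each sentence as one training
    sequence: the last id is the target word, the rest is the context."""
    sequences = []
    for sent in sentences:
        ids = [vocab[tok] for tok in sent.split()]
        for i in range(2, len(ids) + 1):
            sequences.append(ids[:i])
    return sequences

corpus = ["the sky is blue", "the sea is deep"]
vocab = build_vocab(corpus)
seqs = ngram_sequences(corpus, vocab)
# Each 4-word sentence yields 3 prefix sequences, e.g. [1, 2], [1, 2, 3],
# [1, 2, 3, 4] for the first sentence.
```

In a full pipeline, each sequence would then be pre-padded to a fixed length and split into (context, next-word) pairs before being fed to the embedding layer and the Bi-LSTM model.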
License
Copyright (c) 2023 Karma Wangchuk, Tandin Wangchuk, Tenzin Namgyel
This work is licensed under a Creative Commons Attribution 4.0 International License.
All articles published in BJRD are registered under a Creative Commons Attribution 4.0 International License unless otherwise mentioned. BJRD allows unrestricted use, reproduction, and distribution of articles in any medium, provided adequate credit is given to the authors and the source of publication.