A Handbook of Computational Linguistics: Artificial Intelligence in Natural Language Processing

Machine Translation of English to Hindi with the LSTM Seq2Seq Model Utilizing Attention Mechanism

Author(s): Sunil Kumar*, Sandeep Kumar Vishwakarma, Abhishek Singh, Rohit Tanwar and Digvijay Pandey

Pp: 192-210 (19)

DOI: 10.2174/9789815238488124020012


Abstract

Machine translation uses Natural Language Processing (NLP) to automatically translate text between languages, and its popularity has grown with business globalization and the internet. While machine translation can be handy for rapidly comprehending foreign-language content, it is not always precise or dependable, particularly for complicated or idiomatic language. This research presents a neural machine translation approach based on the sequence-to-sequence (Seq2Seq) architecture, using Uni-LSTM and Bi-LSTM encoders with and without attention mechanisms, for translating English sentences into Hindi. We investigated a variety of techniques for constructing machine translation models, including the Seq2Seq model and attention mechanisms. We trained the model on a large parallel corpus of English-Hindi sentence pairs and evaluated it on a separate test set. The efficacy of our approach is demonstrated by the BLEU score of 14.76 achieved by the Bi-LSTM with an attention mechanism, which outperformed the Uni-LSTM in translating English sentences into Hindi. Our results suggest that the proposed Seq2Seq model with attention mechanisms is a promising approach for English-to-Hindi machine translation.
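As a rough illustration of the attention step the abstract refers to, the sketch below implements simple dot-product (Luong-style) attention in plain Python: each encoder hidden state is scored against the current decoder state, the scores are softmaxed into weights, and a weighted context vector is returned. This is a minimal, hypothetical sketch for intuition only; the chapter's actual model (its scoring function, hidden sizes, and framework) may differ.

```python
import math

def attention(decoder_state, encoder_states):
    """Dot-product attention: score each encoder hidden state against
    the decoder state, softmax the scores, and return the weighted
    context vector together with the attention weights."""
    # Alignment scores: dot product of decoder state with each encoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of encoder states.
    context = [
        sum(w * h[i] for w, h in zip(weights, encoder_states))
        for i in range(len(decoder_state))
    ]
    return context, weights

# Toy example: three encoder hidden states, 4-dimensional vectors.
enc = [[1.0, 0.0, 0.0, 0.0],
       [0.0, 1.0, 0.0, 0.0],
       [0.0, 0.0, 1.0, 0.0]]
dec = [0.0, 2.0, 0.0, 0.0]  # most similar to the second encoder state
ctx, w = attention(dec, enc)
```

The weights sum to one and concentrate on the encoder position most similar to the decoder state, so the decoder can focus on the relevant source word at each output step, which is exactly the benefit attention adds over a plain Seq2Seq bottleneck.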

© 2024 Bentham Science Publishers