An integrated mediapipe-optimized GRU model for Indian sign language recognition

Barathi Subramanian, Bekhzod Olimov, Shraddha M. Naik, Sangchul Kim, Kil Houm Park, Jeonghong Kim

Research output: Contribution to journal › Article › peer-review


Abstract

Sign language recognition faces challenges such as accurate tracking of hand gestures, occlusion of hands, and high computational cost. Recently, it has benefited from advances in deep learning techniques. However, these large, complex models struggle to manage long-term sequential data and exhibit poor information-processing and learning efficiency when capturing useful information. To overcome these challenges, we propose an integrated MediaPipe-optimized gated recurrent unit (MOPGRU) model for Indian sign language recognition. Specifically, we improve the update gate of the standard GRU cell by multiplying it by the reset gate, discarding redundant information from the past in a single screening step. Because the update gate receives feedback from the reset gate's output, additional attention is given to the present input. Additionally, we replace the hyperbolic tangent activation in the standard GRU with the exponential linear unit (ELU) activation, and SoftMax with Softsign activation in the output layer of the GRU cell. As a result, the proposed MOPGRU model achieves better prediction accuracy, higher learning efficiency and information-processing capability, and faster convergence than other sequential models.
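The gate modification described in the abstract can be sketched in NumPy. This is a minimal illustration based only on the abstract's description, not the authors' released code: the update gate is multiplied by the reset gate, ELU replaces tanh in the candidate state, and Softsign replaces SoftMax at the output. All weight shapes and the toy dimensions are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softsign(x):
    # Softsign squashes values smoothly into (-1, 1)
    return x / (1.0 + np.abs(x))

def mopgru_cell(x, h, params):
    """One step of a MOPGRU-style cell, sketched from the abstract:
    the update gate z is scaled by the reset gate r, and the candidate
    state uses ELU instead of tanh. (Hypothetical reconstruction.)"""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    r = sigmoid(x @ Wr + h @ Ur + br)           # reset gate
    z = sigmoid(x @ Wz + h @ Uz + bz) * r       # update gate multiplied by reset gate
    h_tilde = elu(x @ Wh + (r * h) @ Uh + bh)   # candidate state with ELU activation
    return (1.0 - z) * h + z * h_tilde          # standard GRU interpolation

# Toy usage with random weights (input dim 4, hidden dim 3 are arbitrary choices)
rng = np.random.default_rng(0)
shapes = [(4, 3), (3, 3), (3,)] * 3             # Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh
params = [rng.normal(scale=0.1, size=s) for s in shapes]
h = np.zeros(3)
x = rng.normal(size=4)
h = mopgru_cell(x, h, params)
y = softsign(h)                                  # Softsign replaces SoftMax at the output
```

Multiplying the update gate by the reset gate means that when the reset gate closes (r near 0), the cell both drops the old candidate contribution and keeps more of the previous hidden state in one pass, which matches the abstract's "one screening" description.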

Original language: English
Article number: 11964
Journal: Scientific Reports
Volume: 12
Issue number: 1
DOIs
State: Published - Dec 2022
