SignConnect: Bilingual ASL and ISL Gesture Detection System Using Deep Learning Techniques
DOI: https://doi.org/10.47392/IRJAEM.2026.0269

Keywords: Sign Language Recognition, YOLOv11, MediaPipe, LSTM, Gesture Recognition, Text-to-Sign

Abstract
Communication between the deaf community and hearing people remains difficult because sign languages are not widely understood. Existing systems typically translate in only one direction, from gestures to text or from text to gestures, which constrains conversation. We propose SignConnect, a two-way translation system combining gesture-to-text and text-to-sign pipelines. In the gesture-to-text pipeline, You Only Look Once version 11 (YOLOv11) detects hand gestures in real time, MediaPipe extracts keypoint information, and a Long Short-Term Memory (LSTM) neural network recognizes gesture sequences. In the text-to-sign pipeline, the input text is first normalized, then tokenized against a sign language dictionary, and finally rendered as animated gestures using 2D/3D visualization. Both pipelines are integrated with the Open Source Computer Vision Library (OpenCV) behind a frontend application that accepts video input and produces text and animation output in real time. The experimental results confirm the effectiveness of the approach.
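The text-to-sign steps the abstract describes (normalization, tokenization against a sign dictionary, then animation) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dictionary contents, clip identifiers, and the fingerspelling fallback for out-of-vocabulary words are all assumptions made for the example.

```python
import re

# Hypothetical miniature sign dictionary: token -> animation clip identifier.
# A real system would map tokens to the paper's 2D/3D gesture animations.
SIGN_DICTIONARY = {
    "hello": "clip_hello",
    "thank": "clip_thank",
    "you": "clip_you",
}

def normalize(text: str) -> str:
    """Lowercase the input and strip punctuation, keeping word boundaries."""
    return re.sub(r"[^a-z0-9\s]", "", text.lower())

def tokenize_to_signs(text: str) -> list:
    """Map each token to a dictionary clip; unknown words fall back
    to letter-by-letter fingerspelling (an assumed fallback strategy)."""
    signs = []
    for token in normalize(text).split():
        if token in SIGN_DICTIONARY:
            signs.append(SIGN_DICTIONARY[token])
        else:
            # Fingerspelling fallback: one sign per letter.
            signs.extend(f"letter_{ch}" for ch in token)
    return signs

print(tokenize_to_signs("Hello, ASL!"))
# -> ['clip_hello', 'letter_a', 'letter_s', 'letter_l']
```

The resulting clip identifiers would then drive the animation renderer; the actual dictionary coverage and rendering backend are specific to the system described in the paper.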
License
Copyright (c) 2026 International Research Journal on Advanced Engineering and Management (IRJAEM)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.