NLP-based Sign Gesture Identification for Disabled People

Authors

  • Pankaj Saraswat, SOEIT, Sanskriti University, Mathura, Uttar Pradesh, India

DOI:

https://doi.org/10.55524/

Keywords:

Communication, Hearing and speech, NLP, Parsing, Sign Language

Abstract

Many methods exist for identifying individual signs, each of which produces one word per sign. This work focuses on converting sign language into a well-formed English sentence, applying NLP techniques in addition to sign recognition. The input is a video of sign language that has been framed and split. It is difficult for hearing people to interact with deaf and speech-impaired people because of the communication barrier, and this article proposes and describes an effective method to address the issue. Language-technology methods such as POS tagging and an LALR parser are used to convert the recognized sign words into English phrases. A number of applications already on the market allow deaf people to interact with the world, yet no combination of existing technologies addresses mobile sign language translation in daily activities. A video interpreter can assist deaf or hearing-impaired people in a variety of situations. As a consequence of this research, people with hearing impairments will be able to learn sign language and have videos translated into sign language. The present work may be used as a communication interface between speech-impaired and non-speech-impaired individuals. By capturing and analyzing signs, then recognizing them and displaying the output in the form of comprehensible phrases, it will help bridge the communication gap between speech-impaired people and the rest of the population.
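The abstract describes a pipeline in which each recognized sign yields one word (a gloss), and NLP steps such as POS tagging and parsing then assemble those words into a grammatical English sentence. The following is a minimal illustrative sketch of that idea, not the paper's actual implementation: the gloss lexicon, the tagging function, and the rule that inserts the articles sign language typically omits are all assumptions made for demonstration.

```python
# Hypothetical sketch: turning a sequence of recognized sign glosses
# into an English sentence via POS tagging plus a simple grammar rule.
# The lexicon and rules are illustrative assumptions only.

POS_LEXICON = {
    "I": "PRON", "YOU": "PRON",
    "EAT": "VERB", "GO": "VERB",
    "APPLE": "NOUN", "SCHOOL": "NOUN",
}

def tag(glosses):
    """Assign a part-of-speech tag to each recognized sign gloss,
    defaulting unknown glosses to NOUN."""
    return [(g, POS_LEXICON.get(g, "NOUN")) for g in glosses]

def to_sentence(tagged):
    """Build an English sentence from tagged glosses, inserting the
    article 'the' before a noun that directly follows a verb."""
    out = []
    for i, (gloss, pos) in enumerate(tagged):
        if pos == "NOUN" and i > 0 and tagged[i - 1][1] == "VERB":
            out.append("the")
        out.append(gloss.lower())
    sentence = " ".join(out)
    return sentence[0].upper() + sentence[1:] + "."

# Example: glosses recognized from video frames, one word per sign.
tagged = tag(["I", "EAT", "APPLE"])
print(to_sentence(tagged))  # prints "I eat the apple."
```

A real system would replace the toy lexicon with a trained POS tagger and the single article rule with a grammar-driven parser, but the data flow (signs to tagged words to sentence) mirrors the one the abstract describes.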


References

Basar S, Adnan A, Khan NH, Haider S. Color Image Segmentation Using K-Means Classification On RGB Histogram. Recent Adv Telecommun Informatics Educ Technol. 2014;

Mohandes M, Liu J, Deriche M. A survey of image based Arabic sign language recognition. In: 2014 IEEE International Multi-Conference on Systems, Signals and Devices, SSD 2014. 2014.

Wu CH, Chiu YH, Guo CS. Text generation from Taiwanese sign language using a PST-based language model for augmentative communication. IEEE Trans Neural Syst Rehabil Eng. 2004;

Rajam PS, Balakrishnan G. Real time Indian Sign Language Recognition System to aid deaf-dumb people. In: International Conference on Communication Technology Proceedings, ICCT. 2011.

Wilson AD, Bobick AF. Parametric hidden Markov models for gesture recognition. IEEE Trans Pattern Anal Mach Intell. 1999;

Dabre K, Dholay S. Machine learning model for sign language interpretation using webcam images. In: 2014 International Conference on Circuits, Systems, Communication and Information Technology Applications, CSCITA 2014. 2014.

Zimmermann M, Chappelier JC, Bunke H. Offline grammar-based recognition of handwritten sentences. IEEE Trans Pattern Anal Mach Intell. 2006;

Mehdi SA, Khan YN. Sign language recognition using sensor gloves. In: ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age. 2002.

Surabhi MC. Natural language processing future. In: 2013 International Conference on Optical Imaging Sensor and Security, ICOSS 2013. 2013.

Jung C, Kim C, Chae SW, Oh S. Unsupervised segmentation of overlapped nuclei using Bayesian classification. IEEE Trans Biomed Eng. 2010;


Published

2021-11-30

How to Cite

NLP-based Sign Gesture Identification for Disabled People. (2021). International Journal of Innovative Research in Computer Science & Technology, 9(6), 41–45. https://doi.org/10.55524/