Portable Device to Assistive Talk for Deaf and Dumb using Hand Gesture

  • Mr. Kallam Arun Kumar Reddy
Keywords: Raspberry Pi, camera, HRI, K-Means (Lloyd's) clustering.

Abstract

In this system, the gestures made by the disabled user are recorded. The hardware is a Raspberry Pi kit connected to a Raspberry Pi camera. Image acquisition begins with capturing the input picture, followed by image pre-processing to separate the foreground from the background, and finally feature extraction to extract the essential information. The extracted features are compared with the dataset, and a speech output for that gesture is generated. Human-Robot Interaction (HRI) is challenging because robots cannot directly understand human language; HRI therefore requires a medium of communication that robots can interpret and humans can easily produce, especially to aid deaf, sick, and elderly people. As a result, giving commands to robots requires gesture recognition as a communication medium. Machine learning is a subfield of Artificial Intelligence (AI) that deals with the development of data-driven or information-based systems. In this study, two approaches for hand gesture recognition as an input command for the Bioloid Premium robot are discussed: Support Vector Machine (SVM) with a directed acyclic graph (DAG) decision scheme, and fuzzy C-Means clustering. The K-Means clustering algorithm, also known as Lloyd's algorithm, groups data based on the Euclidean distance between all data components.
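
As a rough illustration of the clustering step mentioned in the abstract, the sketch below implements Lloyd's (K-Means) algorithm in Python: each sample is assigned to its nearest centroid by Euclidean distance, and centroids are recomputed as the mean of their assigned samples. The function name, parameters, and demo feature vectors are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of Lloyd's (K-Means) clustering: group feature vectors by
# Euclidean distance to the nearest centroid. Names and data are hypothetical.
import numpy as np

def kmeans_lloyd(features, k, n_iters=100, seed=0):
    """Cluster feature vectors (n_samples x n_dims) into k groups."""
    rng = np.random.default_rng(seed)
    features = features.astype(float)
    # Initialise centroids by picking k distinct samples at random.
    centroids = features[rng.choice(len(features), size=k, replace=False)]
    labels = np.zeros(len(features), dtype=int)
    for _ in range(n_iters):
        # Assignment step: each sample joins the nearest centroid (Euclidean distance).
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # converged: assignments no longer change
        labels = new_labels
        # Update step: move each centroid to the mean of its assigned samples.
        for c in range(k):
            members = features[labels == c]
            if len(members) > 0:
                centroids[c] = members.mean(axis=0)
    return labels, centroids

if __name__ == "__main__":
    # Hypothetical example: 2-D feature vectors (e.g. extracted from hand-gesture
    # images) grouped into 3 gesture clusters.
    rng = np.random.default_rng(1)
    demo = np.vstack([rng.normal(loc, 0.3, size=(20, 2))
                      for loc in ((0, 0), (3, 3), (0, 4))])
    labels, centroids = kmeans_lloyd(demo, k=3)
    print("centroids:\n", centroids)
```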

Published
2021-11-23
How to Cite
Mr. Kallam Arun Kumar Reddy. (2021). Portable Device to Assistive Talk for Deaf and Dumb using Hand Gesture. Design Engineering, 15166-15176. Retrieved from http://www.thedesignengineering.com/index.php/DE/article/view/6653
Section
Articles