Detection and Recognition of Real-Time Hand Gestures Using YOLO and AlexNet Techniques

Raad Ahmed Mohamed, Karim Q. Hussein


At least three thousand five hundred million people in the world cannot hear or speak; they are known as the deaf and dumb. This segment of society is often partially isolated from the rest of the community because of the difficulty of interacting, communicating, and understanding. Many solutions have been proposed to bridge the gap between this segment and the rest of society by exploiting the technical development of devices, especially computers and mobile phones. Image processing combined with artificial intelligence has been used to build programs that convert natural speech into a sign language that people with these disabilities (the deaf and dumb) can understand. Initially, a set of sign dictionaries was created for the deaf and dumb, including Indian Sign Language (ISL), American Sign Language (ASL), the European dictionary, and others, mainly to simplify the understanding of sign language. These dictionaries depend primarily on hand movements; one or both hands can be used to form the special characters of a conversation, and the dictionaries can be used after training this segment of people. Owing to technological progress, this process can be automated using a computer and various programming languages, together with peripheral devices (cameras and microphones) and advanced artificial-intelligence algorithms. This goal can be reached by building a dedicated program for communication between people with disabilities (the deaf and dumb) and hearing people, in both directions. The research results show that neural networks, especially convolutional neural networks, are very suitable in terms of accuracy, speed of performance, and generalization to previously unseen input data.
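The two-stage pipeline the abstract describes (a YOLO-style detector that localizes hands, followed by an AlexNet-style convolutional classifier applied to each detected crop) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the `Detection` type, and the stub detector and classifier in the demo are assumptions; in a real system `detect_hands` would wrap a trained YOLO model and `classify_crop` a trained AlexNet.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Detection:
    """One detected hand: a bounding box plus a confidence score."""
    box: Tuple[int, int, int, int]  # (x, y, w, h) in pixel coordinates
    score: float

def recognize_gestures(frame,
                       detect_hands: Callable,
                       classify_crop: Callable,
                       conf_thresh: float = 0.5) -> List[str]:
    """Stage 1: detect hands in the frame. Stage 2: classify each
    sufficiently confident crop into a gesture label."""
    labels = []
    for det in detect_hands(frame):
        if det.score < conf_thresh:
            continue  # discard low-confidence detections
        x, y, w, h = det.box
        # Crop the hand region (frame modeled as a list of pixel rows).
        crop = [row[x:x + w] for row in frame[y:y + h]]
        labels.append(classify_crop(crop))
    return labels

# Demo with hypothetical stand-in components.
frame = [[0] * 4 for _ in range(4)]                      # tiny 4x4 "frame"
stub_detector = lambda f: [Detection((1, 1, 2, 2), 0.9)]  # one confident hand
stub_classifier = lambda crop: "A"                        # always predicts "A"
print(recognize_gestures(frame, stub_detector, stub_classifier))  # ['A']
```

The design point is the separation of concerns: the detector only answers "where are the hands?", so the classifier can be trained on tightly cropped hand images, which is what makes an AlexNet-sized network sufficient for the recognition stage.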


Keywords: the deaf and dumb, Indian Sign Language, American Sign Language, hand gesture, artificial intelligence, hand detection, convolutional neural networks.





