Aims: The study aimed to identify the specific features responsible for gesture recognition, to design a computational model of the recognition process, and to implement and evaluate that model.
Place and Duration of Study: Department of Computer Engineering, Federal Polytechnic, Ede, between August 2017 and February 2018.
Methodology: Hand gesture samples were collected from a deaf school. In total, 40 samples, comprising 4 gestures per numeral, were collected and processed. The samples were pre-processed and rescaled from 340 × 512 pixels to 256 × 256 pixels, then examined for the specific characteristics responsible for gesture recognition using edge detection and the Histogram of Oriented Gradients (HOG) as feature extraction techniques. The model was implemented in MATLAB with a Support Vector Machine (SVM) as its classifier, and the system's performance was evaluated using precision, recall and accuracy as metrics.
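The HOG step described above can be sketched as follows. This is a hedged, simplified illustration in plain Python, not the study's MATLAB implementation: it computes a single orientation histogram of gradient magnitudes over one cell, where a full HOG descriptor would tile the 256 × 256 image into many cells and normalise over blocks before feeding the feature vector to the SVM. The function name `hog_cell` and the 9-bin choice are assumptions for illustration.

```python
import math

def hog_cell(image, bins=9):
    """Orientation histogram of gradients over one cell.

    `image` is a 2-D list of grayscale intensities. Central differences
    give per-pixel gradients; each interior pixel votes its gradient
    magnitude into an unsigned-orientation bin spanning 0-180 degrees.
    (Simplified sketch: a full HOG adds cell tiling and block
    normalisation before SVM classification.)
    """
    h, w = len(image), len(image[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]   # horizontal gradient
            gy = image[y + 1][x] - image[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)                 # gradient magnitude
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / (180.0 / bins)) % bins] += mag
    return hist
```

For a patch containing a vertical edge, the gradient energy concentrates in the bin for horizontal (0°) gradients, which is the kind of shape cue that distinguishes one numeral gesture from another.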
Results: The system showed a high classification rate for the considered hand gestures. Numerals 1, 3, 5 and 7 each recorded 100% accuracy; numerals 2 and 9 had 90%; numeral 4 had 85.67%; numeral 6 had 93.56%; numeral 8 had 88%; and numeral 10 recorded 90.72%. An average recognition rate of 95% was recorded on the test data over a dataset of 40 hand gestures.
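The evaluation metrics named in the Methodology can be made concrete with a short sketch. For one gesture class, counting true/false positives and negatives gives precision, recall and accuracy; the counts below are hypothetical and chosen only to illustrate the formulas, not taken from the study's data.

```python
def metrics(tp, fp, fn, tn):
    """Per-class evaluation metrics from confusion-matrix counts.

    tp: samples of this class correctly recognised
    fp: other-class samples wrongly labelled as this class
    fn: samples of this class the system missed
    tn: other-class samples correctly rejected
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Hypothetical counts for one numeral out of a 40-sample test set:
p, r, a = metrics(tp=9, fp=1, fn=0, tn=30)
```

Averaging such per-class accuracies over all ten numerals yields the overall recognition rate reported above.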
Conclusion: The study successfully classified hand gestures for Yorùbá Sign Language (YSL), confirming that YSL could be incorporated into the deaf educational system. The developed system will enhance communication between hearing and hearing-impaired people.