ASL Recognition Using Leap Motion and Hand Tracking Mechanism
Sign language is a widely used method of communication within the deaf-mute community. It consists of a series of body gestures that enable a person to interact without spoken words. Although sign language is popular among deaf-mute people, other communities rarely attempt to learn it; this creates a gulf of communication and becomes a cause of the isolation of physically impaired people. A system is therefore required to facilitate communication between these two communities. This paper demonstrates a method for recognizing American Sign Language (ASL) gestures using the Leap Motion camera controller. Features are extracted from the hand using the Leap Motion camera and fed to an Artificial Neural Network (ANN) classifier to develop a model that recognizes both static and dynamic hand gestures. In addition, a robotic arm built with an Arduino Uno microcontroller and two servo motors tracks the hand to keep it within the Leap Motion's small viewing domain while gestures are performed. The dataset obtained comprises 3600 ASL images, of which 3400 were taken for the 24 static letters and 10 numbers, with the remaining images covering the 2 dynamic letters (Z and J). Ten images were taken for every letter and number from 10 different signers, giving a total of 3600 dataset images. Recognition of ASL images was performed by passing the extracted features as inputs to the ANN classifier, yielding an overall average success rate of 84.66% and a weighted average success rate of 83.11% in the recognition of ASL using the Leap Motion.
Keywords— Leap Motion, ASL, image processing, feature extraction, DCT, ANN, Arduino Uno microcontroller, servo motors, robotic arm.
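The classification step described in the abstract (hand features fed to an ANN) can be sketched as a forward pass through a small feed-forward network. This is an illustrative sketch only: the 15-element feature vector, single hidden layer, random weights, and 36-class output (26 letters plus 10 numbers) are assumptions for demonstration, not the paper's trained model.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer ANN: sigmoid hidden units,
    softmax output giving a probability per ASL sign class."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))  # hidden-layer activations
    z = W2 @ h + b2                           # output logits
    e = np.exp(z - z.max())                   # numerically stable softmax
    return e / e.sum()

# Illustrative dimensions (assumed): 15 hand features in,
# 36 sign classes out (26 letters + 10 numbers).
rng = np.random.default_rng(0)
n_features, n_hidden, n_classes = 15, 20, 36
W1 = rng.normal(size=(n_hidden, n_features))
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_classes, n_hidden))
b2 = np.zeros(n_classes)

x = rng.normal(size=n_features)  # stand-in for an extracted feature vector
probs = forward(x, W1, b1, W2, b2)
predicted = int(np.argmax(probs))  # index of the recognized sign class
```

In practice the weights would be learned from the 3600-image dataset via backpropagation rather than drawn at random; the sketch only shows how a feature vector maps to a class probability distribution.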