Paper Title
Deaf-Mute Hand Gesture Classification Using Long Short-Term Memory & Multi-Layer Perceptron

Abstract
The study aimed to improve the accuracy of hand motion and gesture recognition by using a Long Short-Term Memory (LSTM) network as the feature extractor and a Multi-Layer Perceptron (MLP) as the classifier, and to compare the method's classification accuracy against other algorithms such as LSTM+CNN (LCNN), CNN-RNN, VGG16, and DCNN+MCSVM. In this paper, a novel LSTM+MLP model is presented for the classification of American Sign Language (ASL) hand signs, using a dataset of 400 images. Based on the findings of the study, the LSTM+MLP model outperformed the previous algorithms, achieving a 100% accuracy rate during training and testing, as shown in the tables, illustrations, and graphs.

Keywords - Classification, Long Short-Term Memory, Multi-Layer Perceptron, Neural Networks
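
For illustration, a minimal sketch of an LSTM feature extractor followed by an MLP classification head is given below, assuming a TensorFlow/Keras implementation in which each image is read row by row as a sequence. The input size, layer widths, and number of classes (NUM_CLASSES) are hypothetical placeholders and are not taken from the paper.

# Minimal sketch: LSTM as feature extractor, MLP (dense layers) as classifier.
# Assumptions: 64x64 grayscale images treated as 64 time steps of 64 features;
# NUM_CLASSES and layer sizes are illustrative, not the paper's actual values.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 20            # hypothetical number of ASL signs
IMG_ROWS, IMG_COLS = 64, 64

model = models.Sequential([
    layers.Input(shape=(IMG_ROWS, IMG_COLS)),
    # LSTM scans the image row by row and produces a fixed-length feature vector.
    layers.LSTM(128, return_sequences=False),
    # MLP head maps the extracted features to class probabilities.
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

A model of this form would be trained with model.fit on the image arrays and integer class labels; the design choice of feeding image rows as a sequence is one common way to apply an LSTM to static images and is assumed here rather than confirmed by the paper.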