Paper Title
Using Real-Time Hand Gesture Tracking to Interact and Play with VR Insect Specimen Experiments

Abstract
In this paper, we propose a more flexible and less cumbersome method for intuitively manipulating a VR insect specimen application through hand gesture tracking. Users typically rely on hand-held controllers to respond to environmental stimuli and interact within VR. In contrast, we use an HTC Vive Pro headset with a ZED Mini stereo camera mounted on its front to capture the user's hand gestures, replacing traditional VR controllers. The key insight behind our approach is to leverage convolutional neural networks to track the wrist, palm, and fingertips of a single hand directly from visual imagery. The learning pipeline is divided into two convolutional neural network models: one locates the palm region, and the other detects the 2D positions of the fingertips. The proposed method then applies inverse kinematics to reconstruct a full hand skeleton, from which valid commands are identified and carried out. Experimental results show that the proposed method tracks hand gestures accurately enough to interact and play with VR insect specimen experiments.

Keywords - Virtual Reality, Hand Gesture Tracking, Convolutional Neural Networks, Inverse Kinematics, Insect Specimen Experiments.
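To make the two-stage pipeline described above more concrete, the following is a minimal PyTorch sketch, not the authors' implementation: a palm-localization CNN followed by a fingertip-detection CNN, whose 2D keypoints would then be passed to an inverse-kinematics solver. The class names (`PalmLocator`, `FingertipDetector`), network architectures, tensor shapes, and the choice of seven keypoints (wrist, palm center, five fingertips) are illustrative assumptions.

```python
# Minimal sketch of a two-stage hand-tracking pipeline: palm localization,
# then fingertip keypoint detection. Architectures and shapes are assumptions.
import torch
import torch.nn as nn


class PalmLocator(nn.Module):
    """Stage 1: regress a normalized palm bounding box (cx, cy, w, h) from the frame."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 4)

    def forward(self, frame):
        return torch.sigmoid(self.head(self.features(frame).flatten(1)))


class FingertipDetector(nn.Module):
    """Stage 2: predict 2D positions of wrist, palm center, and five fingertips
    (seven keypoints, an assumed layout) from the palm region."""
    def __init__(self, num_keypoints=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_keypoints * 2)

    def forward(self, crop):
        return torch.sigmoid(self.head(self.features(crop).flatten(1)))


def track_hand(frame, palm_net, tip_net):
    """Run both CNN stages on one RGB frame of shape (1, 3, H, W).

    Returns the palm box and normalized 2D keypoints; a full system would crop
    the frame using the box and feed the keypoints into an IK solver to
    reconstruct the hand skeleton before command recognition.
    """
    box = palm_net(frame)                      # (1, 4) palm region
    # This sketch reuses the whole frame instead of cropping with `box`.
    keypoints = tip_net(frame).view(-1, 7, 2)  # (1, 7, 2) keypoints
    return box, keypoints


if __name__ == "__main__":
    dummy = torch.rand(1, 3, 128, 128)  # stand-in for a ZED Mini frame
    box, kps = track_hand(dummy, PalmLocator(), FingertipDetector())
    print(box.shape, kps.shape)  # torch.Size([1, 4]) torch.Size([1, 7, 2])
```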