Abstract—Real-time, accurate hand gesture recognition (HGR) has become an important research topic. In this article, we implement a loose hand gesture recognition (LHGR) system based on relational features using a depth sensor; the system not only maintains high accuracy in real-time processing but also allows the user to perform loose gestures. HGR is usually divided into three stages: hand detection, hand feature extraction, and gesture classification. The proposed method improves all three stages. In the hand detection stage, we propose a dynamic ROI estimation method and a wrist-cutting method that conform to the characteristics of the human hand. In the feature extraction stage, we use more reliable relational features constructed from local features, global features, and depth coding. In the gesture classification stage, we use three layers of classifiers — finger counting, finger name matching, and coding comparison — to classify 16 hand gestures. Finally, the output is refined by an adaptive decision. The average processing time is 38.6 ms per frame. Our method achieves an average accuracy of about 98.29% on standard gestures and about 88.32% on loose gestures. In summary, our LHGR system classifies hand gestures robustly and still achieves acceptable results for loose gestures.
Index Terms—Computer vision, hand gesture recognition, human-computer interaction, image processing, Kinect.
Chen-Ming Chang and Din-Chang Tseng are with the Institute of Computer Science and Information Engineering, National Central University, Jhongli, Taiwan 32001 (e-mail: 985402003@cc.ncu.edu.tw, tsengdc@ip.csie.ncu.edu.tw).
Cite: Chen-Ming Chang and Din-Chang Tseng, "Loose Hand Gesture Recognition Based on Relational Features Using a Depth Sensor," International Journal of Future Computer and Communication, vol. 7, no. 2, pp. 26-31, 2018.