
IJFCC 2018 Vol.7(2): 26-31 ISSN: 2010-3751
DOI: 10.18178/ijfcc.2018.7.1.515

Loose Hand Gesture Recognition Based on Relational Features Using a Depth Sensor

Chen-Ming Chang and Din-Chang Tseng

Abstract—Accurate, real-time hand gesture recognition (HGR) has become an important research topic. In this article, we implement a loose hand gesture recognition (LHGR) system based on relational features from a depth sensor; the system maintains high accuracy in real-time processing while also allowing the user to perform loose gestures. HGR is usually divided into three stages: hand detection, hand feature extraction, and gesture classification. The proposed method improves all three stages. In the hand detection stage, we propose a dynamic ROI estimation method and a wrist-cutting method that conform to the characteristics of a human hand. In the feature extraction stage, we use more reliable relational features constructed from local features, global features, and depth coding. In the gesture classification stage, we use three layers of classifiers, namely finger counting, finger name matching, and coding comparison, to classify 16 kinds of hand gestures. Finally, the output is adjusted by an adaptive decision. The average processing time per frame is 38.6 ms. Our method achieves an average accuracy of about 98.29% for standard gestures and about 88.32% for loose gestures. In summary, our LHGR system robustly classifies hand gestures and still achieves acceptable results for loose gestures.
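
The abstract describes a three-stage pipeline (hand detection, relational feature extraction, layered gesture classification) followed by an adaptive decision. As a rough illustration only, the following Python sketch shows how such a pipeline could be wired together; the function names, thresholds, and the majority-vote smoothing are hypothetical stand-ins and do not reproduce the authors' algorithms.

# Illustrative three-stage pipeline (hand detection -> relational feature
# extraction -> layered classification) loosely following the structure
# described in the abstract. Every name, threshold, and heuristic below is a
# placeholder assumption, not the authors' implementation.
import numpy as np


def detect_hand(depth_frame, prev_roi=None):
    """Estimate a hand region from a depth frame.

    Placeholder: keeps the nearest 10% of pixels as the hand and reuses the
    previous ROI; a real detector would also cut the forearm at the wrist.
    """
    roi = prev_roi if prev_roi is not None else (0, 0) + depth_frame.shape
    hand_mask = depth_frame < np.percentile(depth_frame, 10)
    return hand_mask, roi


def extract_relational_features(hand_mask, depth_frame):
    """Combine simple local/global cues and a coarse depth code into a vector."""
    ys, xs = np.nonzero(hand_mask)
    if xs.size == 0:
        return None
    centroid_x, centroid_y = xs.mean(), ys.mean()       # global features
    spread = (xs.std() + ys.std()) / 2.0                 # global feature
    depth_code = int(np.digitize(depth_frame[hand_mask].mean(), [400, 800, 1200]))
    return np.array([centroid_x, centroid_y, spread, depth_code])


def classify_gesture(features):
    """Stand-in for the layered classifier (finger counting, finger name
    matching, code comparison); reduced here to a trivial bucketing rule."""
    if features is None:
        return "none"
    return "gesture_%d" % (int(features[2]) % 16)


def recognize(depth_frames):
    """Run the pipeline per frame; a majority vote over the last five labels
    stands in for the adaptive decision that smooths the final output."""
    history, results, roi = [], [], None
    for frame in depth_frames:
        mask, roi = detect_hand(frame, roi)
        label = classify_gesture(extract_relational_features(mask, frame))
        history = (history + [label])[-5:]
        results.append(max(set(history), key=history.count))
    return results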

Index Terms—Computer vision, hand gesture recognition, human-computer interaction, image processing, Kinect.

Chen-Ming Chang and Din-Chang Tseng are with the Institute of Computer Science and Information Engineering, National Central University, Jhongli, Taiwan 32001 (e-mail: 985402003@cc.ncu.edu.tw, tsengdc@ip.csie.ncu.edu.tw).


Cite: Chen-Ming Chang and Din-Chang Tseng, "Loose Hand Gesture Recognition Based on Relational Features Using a Depth Sensor," International Journal of Future Computer and Communication, vol. 7, no. 2, pp. 26-31, 2018.
