• Aug 09, 2018 News! [CFP] The annual meeting of the IJFCC Editorial Board, ICCTD 2019, will be held in Prague, Czech Republic during March 2-4, 2019.
  • Aug 09, 2018 News! IJFCC Vol. 6, No. 1-No. 3 has been indexed by EI (Inspec).
  • Dec 24, 2018 News! The papers published in Vol. 7, No. 1-No. 2 have all received DOIs from Crossref.
General Information
    • ISSN: 2010-3751
    • Frequency: Bimonthly (2012-2016); Quarterly (Since 2017)
    • DOI: 10.18178/IJFCC
    • Editor-in-Chief: Prof. Mohamed Othman
    • Executive Editor: Ms. Cherry L. Chan
    • Abstracting/Indexing: Google Scholar, Crossref, Electronic Journals Library, EI (INSPEC, IET), etc.
    • E-mail: ijfcc@ejournal.net
Prof. Mohamed Othman
Department of Communication Technology and Network Universiti Putra Malaysia, Malaysia
It is my honor to be the editor-in-chief of IJFCC. The journal publishes high-quality papers in the field of future computer and communication. I hope that IJFCC will become a recognized journal among readers in the field of future computer and communication.
IJFCC 2018 Vol.7(2): 26-31 ISSN: 2010-3751
DOI: 10.18178/ijfcc.2018.7.1.515

Loose Hand Gesture Recognition Based on Relational Features Using a Depth Sensor

Chen-Ming Chang and Din-Chang Tseng
Abstract—Hand gesture recognition (HGR) that is both real-time and precise has become an important research topic. In this article, a loose hand gesture recognition (LHGR) system based on relational features using a depth sensor is implemented, which not only maintains high accuracy in real-time processing but also allows the user to make loose gestures. HGR is usually divided into three stages: hand detection, hand feature extraction, and gesture classification. The proposed method improves all three stages. In the hand detection stage, we propose an ROI dynamic estimation method and a wrist-cutting method that conform to the characteristics of a human hand. In the feature extraction stage, we use more reliable relational features, constructed from local features, global features, and depth coding. In the gesture classification stage, we use three layers of classifiers, including finger counting, finger name matching, and coding comparison; these layers are used to classify 16 kinds of hand gestures. Finally, the output is adjusted by an adaptive decision. The average processing speed is 38.6 ms per frame. Our method achieves an average accuracy of about 98.29% for standard gestures and about 88.32% for loose gestures. In summary, our LHGR system can robustly classify hand gestures and still achieve acceptable results for loose gestures.
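The three-layer classification described in the abstract (finger counting, finger name matching, then coding comparison) can be pictured as a cascade that progressively narrows the candidate set. The sketch below is a minimal illustration of that cascade structure only; the gesture names, templates, and depth-code strings are hypothetical stand-ins, not the paper's actual 16-gesture vocabulary or feature encoding.

```python
# Hypothetical sketch of a three-layer cascade classifier.
# Each template: (extended-finger pattern thumb..pinky, depth-code string).
# All entries below are invented for illustration.
GESTURES = {
    "fist":    ((0, 0, 0, 0, 0), "00000"),
    "point":   ((0, 1, 0, 0, 0), "01000"),
    "victory": ((0, 1, 1, 0, 0), "01100"),
    "open":    ((1, 1, 1, 1, 1), "11111"),
}

def classify(fingers, depth_code):
    """fingers: 5-tuple of 0/1 flags (thumb..pinky); depth_code: string."""
    count = sum(fingers)
    # Layer 1: finger counting -- keep templates with the same finger count.
    candidates = {name: (pat, code) for name, (pat, code) in GESTURES.items()
                  if sum(pat) == count}
    # Layer 2: finger name matching -- the extended fingers must be the same ones.
    candidates = {name: (pat, code) for name, (pat, code) in candidates.items()
                  if pat == tuple(fingers)}
    # Layer 3: coding comparison -- the depth code must also agree.
    for name, (_, code) in candidates.items():
        if code == depth_code:
            return name
    return "unknown"
```

Each layer is cheap and prunes the search for the next, which is one plausible reason a cascade like this can keep per-frame cost low while still distinguishing many gestures.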

Index Terms—Computer vision, hand gesture recognition, human-computer interaction, image processing, Kinect.

Chen-Ming Chang and Din-Chang Tseng are with the Institute of Computer Science and Information Engineering, National Central University, Jhongli, Taiwan 32001 (e-mail: 985402003@cc.ncu.edu.tw, tsengdc@ip.csie.ncu.edu.tw).


Cite: Chen-Ming Chang and Din-Chang Tseng, "Loose Hand Gesture Recognition Based on Relational Features Using a Depth Sensor," International Journal of Future Computer and Communication vol. 7, no. 2, pp. 26-31, 2018.

Copyright © 2008-2018. International Journal of Future Computer and Communication. All rights reserved.