• Jan 05, 2017 News! [CFP] ICCTD 2017, the 2017 annual meeting of the IJFCC Editorial Board, will be held in Paris, France during March 20-22, 2017.
  • Mar 24, 2016 News! IJFCC Vol. 4, No. 4 has been indexed by EI (Inspec).
  • Mar 24, 2017 News! Vol. 6, No. 1 has been published with the online version.
General Information
    • ISSN: 2010-3751
    • Frequency: Bimonthly (2012-2016); Quarterly (Since 2017)
    • DOI: 10.18178/IJFCC
    • Editor-in-Chief: Prof. Mohamed Othman
    • Executive Editor: Ms. Nancy Y. Liu
    • Abstracting/Indexing: Google Scholar, Engineering & Technology Digital Library, Crossref, DOAJ, Electronic Journals Library, and EI (INSPEC, IET).
    • E-mail:  ijfcc@ejournal.net 
Editor-in-chief
Prof. Mohamed Othman
Department of Communication Technology and Network, Universiti Putra Malaysia, Malaysia
It is my honor to be the editor-in-chief of IJFCC. The journal publishes high-quality papers in the field of future computer and communication. I hope IJFCC will become a recognized journal among readers in the field of future computer and communication.
IJFCC 2012 Vol.1(3): 312-315 ISSN: 2010-3751
DOI: 10.7763/IJFCC.2012.V1.84

The Performance of Eyes and Hand Gesture Translation to Control Body Turn for Paralyzed Patients

Pimpisa Chankumsri, Waranee Chokanunt, and Nunnapad Toadithep

Abstract—This research aims to use image processing technology to assist paralyzed patients in changing their posture in bed: turning left, turning right, and returning to the center. It relies on the parts of the body that are not paralyzed: the fingers and the eyes. Images of the patient's actions are captured from a web camera; when the starting sign is detected, the process of interpreting the meaning of the fingers or eyes is performed. The main principle of finger detection is to subtract the background image from the input image and then use a contour algorithm to count the number of raised fingers: one, two, or three. For interpreting actions from the eyes, the patient's face is first detected using Haar-like features, and the Hough Transform is then used to detect the eye pupil and interpret the result. In tests with 5 persons, each performing 10 actions, we found that the accuracy of hand actions is 95.33% and the accuracy of eye actions is 94.67%.
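The abstract describes two image-processing pipelines: background subtraction with contour analysis for counting raised fingers, and Haar-like face detection followed by a Hough transform to locate the eye pupil. The Python/OpenCV sketch below illustrates how such a pipeline could be assembled; it is not the authors' implementation, and the function names, thresholds, and cascade files are illustrative assumptions.

    import cv2

    def count_fingers(frame_gray, background_gray, diff_threshold=25):
        # Subtract the static background from the current grayscale frame and
        # keep only pixels that changed noticeably (illustrative threshold).
        diff = cv2.absdiff(frame_gray, background_gray)
        _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
        mask = cv2.medianBlur(mask, 5)
        # Take the largest contour as the hand; OpenCV 4.x return signature.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0
        hand = max(contours, key=cv2.contourArea)
        # Deep convexity defects correspond to gaps between fingers;
        # fingers = deep defects + 1, capped at three as in the paper.
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return 1
        deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
        return min(deep + 1, 3)

    # Standard Haar cascades shipped with OpenCV (assumed available).
    face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

    def pupil_positions(frame_gray):
        # Detect the face with Haar-like features, then search each eye region
        # for a circular pupil with the Hough circle transform.
        centres = []
        for (fx, fy, fw, fh) in face_cascade.detectMultiScale(frame_gray, 1.3, 5):
            face_roi = frame_gray[fy:fy + fh, fx:fx + fw]
            for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
                eye_roi = cv2.medianBlur(face_roi[ey:ey + eh, ex:ex + ew], 5)
                circles = cv2.HoughCircles(eye_roi, cv2.HOUGH_GRADIENT, dp=1,
                                           minDist=ew, param1=50, param2=20,
                                           minRadius=3, maxRadius=ew // 4)
                if circles is not None:
                    cx, cy, _ = circles[0][0]
                    centres.append((fx + ex + int(cx), fy + ey + int(cy)))
        return centres

Mapping the detected finger count (one, two, or three) or the pupil's horizontal offset within the eye region to "turn left", "turn right", or "center" would then drive the posture command, in the manner the abstract describes.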

Index Terms—Haar-like feature, finger, eyes, paralyzed patients, body turn.

The authors are with the Department of Computing, Faculty of Science, Silpakorn University, Nakhon Pathom (e-mail: pimpisa540@gmail.com; waranee561@gmail.com; nunnapad@gmail.com).

[PDF]

Cite: Pimpisa Chankumsri, Waranee Chokanunt, and Nunnapad Toadithep, "The Performance of Eyes and Hand Gesture Translation to Control Body Turn for Paralyzed Patients," International Journal of Future Computer and Communication, vol. 1, no. 3, pp. 312-315, 2012.

Copyright © 2008-2016. International Journal of Future Computer and Communication. All rights reserved.
E-mail: ijfcc@ejournal.net