Abstract—This research applies image processing to help paralyzed patients change their posture in bed: turn left, turn right, or lie at the center. It relies on body parts that are not paralyzed: the fingers and the eyes. Images of the patient's actions are captured by a web camera; once a starting sign is detected, the finger or eye gestures are interpreted. Finger detection subtracts the input image from a stored background image and applies a contour algorithm to count the extended fingers: one, two, or three. For eye gestures, the patient's face is first detected using Haar-like features, and the eye pupil is then located with the Hough transform and its position interpreted. In tests with 5 persons performing each action 10 times, hand gestures were recognized with 95.33% accuracy and eye gestures with 94.67% accuracy.
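A minimal sketch of the two recognition pipelines outlined in the abstract, assuming OpenCV 4.x (cv2) and NumPy. The threshold values, defect-depth cutoff, and region splits below are illustrative assumptions, not parameters reported in the paper.

```python
import cv2

def count_fingers(frame, background, thresh=40):
    """Subtract the stored background, then count fingers from the largest contour."""
    diff = cv2.absdiff(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(background, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)        # largest blob assumed to be the hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 1
    # Deep convexity defects lie between extended fingers (depth cutoff is a guess).
    gaps = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
    return min(gaps + 1, 3)                          # clamp to the 1-3 commands used

def locate_pupil(frame, face_cascade):
    """Detect the face with a Haar cascade, then the pupil with a Hough circle."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    eye_region = gray[y:y + h // 2, x:x + w]         # search the upper half of the face
    circles = cv2.HoughCircles(eye_region, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=50, param2=30, minRadius=5, maxRadius=30)
    if circles is None:
        return None
    cx = circles[0, 0, 0]                            # x-coordinate of the first circle found
    third = eye_region.shape[1] / 3
    return "left" if cx < third else "right" if cx > 2 * third else "center"

# Example wiring (hypothetical): load the stock frontal-face cascade and read one frame.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
```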
Index Terms—Haar-like feature, finger, eyes, paralyzed patients, body turn.
The authors are with the Department of Computing, Faculty of Science, Silpakorn University, Nakhon Pathom (e-mail: pimpisa540@gmail.com; waranee561@gmail.com; nunnapad@gmail.com).
Cite: Pimpisa Chankumsri, Waranee Chokanunt, and Nunnapad Toadithep, "The Performance of Eyes and Hand Gesture Translation to Control Body Turn for Paralyzed Patients," International Journal of Future Computer and Communication vol. 1, no. 3, pp. 312-315, 2012.