Hand gesture recognition is one of the most active research areas in the field of Computer Vision. It lets people interact with machines without any extra device, so even users with little technical knowledge of the system can operate it with their bare hands. Gestures convey the meaning of what a person says: they accompany words naturally, help the receiver understand the communication, and allow individuals to express feelings and thoughts, with or without words.
This paper presents a parallel implementation of hand gesture recognition. Human beings can make arbitrary gestures, but only a few carry a special meaning; the human hand can move in any direction and bend to any angle in all available coordinates.
This hand gesture recognition algorithm is based on a chapter from the book by K. Kraiss and uses the AForge.NET framework to capture frames from a webcam in real time. The captured frames are processed to detect the hand, features are extracted, and the gestures are identified from those features. Three types of gestures can be recognized: stop, right and left, as seen in Figure 1. The program was designed in C# with .NET Framework 4 and uses its threading support to implement a parallel version of the algorithm.
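The sketch below illustrates how such a capture-and-process pipeline could be wired up with AForge.NET and .NET Framework 4 threading: frames arriving from the webcam are cloned and dispatched to worker threads so capture is never blocked. The Classify method is a hypothetical placeholder standing in for the paper's detection, feature extraction and gesture identification stages, not the authors' actual implementation.

```csharp
using System;
using System.Drawing;
using System.Threading.Tasks;
using AForge.Video;
using AForge.Video.DirectShow;

class GestureCapture
{
    static VideoCaptureDevice camera;

    static void Main()
    {
        // Enumerate webcams attached to the system and open the first one.
        var devices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
        camera = new VideoCaptureDevice(devices[0].MonikerString);

        // Every captured frame is handed to the processing pipeline.
        camera.NewFrame += OnNewFrame;
        camera.Start();

        Console.ReadLine();        // run until the user presses Enter
        camera.SignalToStop();
    }

    static void OnNewFrame(object sender, NewFrameEventArgs eventArgs)
    {
        // Clone the frame: AForge reuses the Bitmap buffer after the handler returns.
        Bitmap frame = (Bitmap)eventArgs.Frame.Clone();

        // Offload hand detection, feature extraction and classification to a worker
        // thread so the capture loop stays responsive. Task.Factory.StartNew is the
        // .NET Framework 4 idiom (Task.Run only exists from 4.5 onwards).
        Task.Factory.StartNew(() =>
        {
            using (frame)
            {
                string gesture = Classify(frame);   // hypothetical pipeline entry point
                Console.WriteLine("Detected gesture: " + gesture);
            }
        });
    }

    // Placeholder for the detection / feature-extraction / classification stages.
    static string Classify(Bitmap frame)
    {
        return "stop";   // one of "stop", "right", "left"
    }
}
```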
Hand gesture recognition has been applied in many domains, from simple game input to critical applications, and it is a natural way to interact with vision-enabled computers and other machines. This paper focused primarily on the study of work done in natural hand gesture recognition using Computer Vision techniques. In the future we will work on detecting the bending and movement of individual finger positions, as very little work has been done in this area; most researchers have worked with full hand position detection or with fingertip positions to write virtual words.