Hand gestures are used constantly in daily life and can be highly expressive forms of communication. Developing a hand-gesture-based input interface for the computer is therefore a natural idea.
The input interface presented here consists of a low-cost webcam and software that analyzes the video stream coming from the camera. Tracking of the hand position starts when a dedicated trigger gesture is detected. Two different approaches were tested to achieve hand tracking: the Virtual Touchpad (VTP) and the Virtual Touchscreen (VTS). The VTP segments the hand against a white A4 board, while the VTS detects the hand in the air when it is close enough to the camera. The resolution and robustness of these devices are currently limited, but they work, demonstrate the potential benefits of the technology, and allow the mouse pointer on the screen to be controlled via simple gestures.
The demonstration application was written in C++ using the OpenCV and cvBlobs libraries. The system combines color-based tracking, Chamfer matching, and blob analysis. A detailed description of the project and the test results are included in the thesis.