Application of Learning Image Processing Algorithms for Gesture Detection

Supervisor:
Kovács Viktor
Department of Automation and Applied Informatics

A demand has emerged among computer users for new kinds of user interfaces. As a result, new input devices have appeared that differ from the traditional mouse and keyboard combination. Nowadays it is possible to control a computer with hand gestures or even with pupil movement. The drawback of these interfaces is the hardware cost, because they require high-quality cameras. CozyTap solves this problem by using the laptop's built-in camera. Only one small and simple piece of hardware is required: a mirror that reflects the image of the keyboard to the camera. During use, the hands never have to leave the keyboard, not even for a second; cursor control can be activated with a simple hand gesture.

The goal of my work is to extend the CozyTap system with a new feature: the ability to teach the program new hand gestures and to link these gestures to any custom function. For this purpose, the existing system provides segmented images of the user's left hand, together with the coordinates of 9 specific points of the same hand.
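As an illustration of the kind of input this yields, the following minimal sketch shows one possible way to represent a single training sample; the field names and array shapes are my own assumptions, not the actual CozyTap data format:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class GestureSample:
    """One training example as described above.

    Field names and shapes are illustrative assumptions; the real
    CozyTap output may be structured differently.
    """
    label: str              # name of the gesture the user wants to teach
    landmarks: np.ndarray   # shape (9, 2): pixel coordinates of the 9 hand points
    hand_image: np.ndarray  # segmented image of the user's left hand
```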

In my work I investigated which machine learning algorithms are suitable for this purpose. I selected the two most promising ones and wrote a program that reads the necessary data from a text file, trains the algorithm, and evaluates it with cross-validation. I swept the parameters of the algorithms over a wide range and examined how the results depend on the quality, quantity, and diversity of the training data.
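The thesis does not name the two algorithms, the text-file layout, or the library used; as a minimal sketch of the described workflow (read landmark data from a text file, sweep one hyperparameter over a wide range, and score each setting with cross-validation), one could proceed for example as follows with scikit-learn and an SVM, where the file name and line format are hypothetical:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Illustrative only: the algorithms, features, and file format actually
# used in the thesis are not specified here.

def load_dataset(path: str):
    """Read lines of the assumed form '<label> x1 y1 ... x9 y9'."""
    X, y = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 19:        # 1 label + 9 (x, y) coordinate pairs
                continue
            y.append(parts[0])
            X.append([float(v) for v in parts[1:]])
    return np.array(X, dtype=np.float32), np.array(y)

X, y = load_dataset("gestures.txt")     # hypothetical file name

# Sweep one hyperparameter over a wide interval and evaluate each
# setting with 5-fold cross-validation.
for C in (0.01, 0.1, 1.0, 10.0, 100.0):
    scores = cross_val_score(SVC(C=C, kernel="rbf"), X, y, cv=5)
    print(f"C={C:g}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```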
