Most smartphones and video game consoles today come with built-in sensors that can track their users' movements. Even so, no widespread solution exists for controlling a computer with custom gestures in everyday use.
In this thesis, I walk through the design and implementation of a motion-pattern-based human–machine interface, from motion sensing to the recognition of previously taught gestures. I created a custom handheld device, and the computer is controlled by moving this device. The thesis reviews the characteristics and operation of the hardware components, as well as the structure of the embedded software running on the microcontroller.
Gesture recognition is a well-known problem in the domain of machine learning, so a handful of algorithms are at the developer's disposal. I briefly describe them and, after evaluating the most suitable ones with real-life test data, compare their performance. As raw sensor data cannot be fed directly into these algorithms, I also explain how the data can be converted into a usable form.
The completed system is shown to recognize gestures with a success rate of over 85% during the evaluation, and suggestions are made for improvements that could yield even better results.