Development of a 3D User Interface using Kinect

Supervisor:
Dr. Vajda Ferenc
Department of Control Engineering and Information Technology

The user interface is the most important connection between human and machine. There is a growing body of research aiming to reduce reliance on physical controllers such as the mouse and keyboard. One way of rethinking control is the Natural User Interface approach, whose goal is to make interacting with the computer as natural as possible. My thesis work revolved around such a system: I had the chance to join an ongoing project in which instructions are given to a 3D display system through a gesture recognition module. Our goal was to create a virtual workspace that the user can control with natural hand movements.

I familiarized myself with the Microsoft Kinect 3D sensor, the related frameworks, and similar projects.

My task was to extend the gesture recognition module. Besides the basic gestures that were already available, such as swipes to the sides or up and down, I implemented the recognition of diagonal hand movements and of the grab gesture. With these, the user can select and grab elements of the UI, and with two-handed grabs can even rotate certain virtual objects. I also created a third program module that acts as a central platform, so that more than one input-device layer or display layer can connect to it and communicate with the others.
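To illustrate how such a swipe classifier can be structured, here is a minimal, self-contained C++ sketch. Everything in it is my own assumption rather than the project's actual code: the HandSample structure, the threshold constants, and the sample values in main() are illustrative, and in the real module the hand coordinates would come from the Kinect skeletal tracker.

#include <cmath>
#include <cstdio>
#include <string>

// Illustrative 3D hand sample; in the real system these coordinates
// would come from the Kinect skeletal tracker (hypothetical structure).
struct HandSample {
    float x, y, z;  // metres, camera space; y grows upward
};

// Assumed tuning constants (not taken from the thesis).
constexpr float SWIPE_MIN_DISTANCE = 0.25f;  // minimum hand travel in metres
constexpr float DIAGONAL_RATIO     = 0.5f;   // minor/major axis ratio above this => diagonal

// Classify the dominant direction of a completed hand movement,
// including the diagonal swipes added in the thesis work.
std::string classifySwipe(const HandSample& start, const HandSample& end) {
    const float dx = end.x - start.x;
    const float dy = end.y - start.y;
    if (std::sqrt(dx * dx + dy * dy) < SWIPE_MIN_DISTANCE)
        return "none";  // hand travelled too little to count as a gesture

    const float ax = std::fabs(dx), ay = std::fabs(dy);
    const bool diagonal = (ax < ay ? ax / ay : ay / ax) > DIAGONAL_RATIO;

    const std::string h = dx > 0 ? "right" : "left";
    const std::string v = dy > 0 ? "up" : "down";
    if (diagonal) return v + "-" + h;  // e.g. "up-right"
    return ax > ay ? h : v;            // otherwise report the dominant axis only
}

int main() {
    HandSample start{0.00f, 0.00f, 1.2f};
    HandSample end{0.30f, 0.28f, 1.2f};
    std::printf("gesture: %s\n", classifySwipe(start, end).c_str());  // prints: up-right
    return 0;
}

The ratio test is one simple way to separate diagonal swipes from axis-aligned ones; grab detection and two-handed rotation would build on the same per-frame hand samples.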

I worked with my research partner, Bálint Pusztai, who developed the display module. We knew that in systems like this the most important aspect is the user experience: it has to be fun and easy to use.

Who wouldn’t want to browse pictures the way we saw in Minority Report, or, to take a more recent example, like Tony Stark in the movie Iron Man: controlling a holographic workspace with gestures, grabbing virtual objects and moving them with a slight movement of the hand?

With research like ours, in the kitchen of the future the cook will not have to touch the cookbook with dirty hands; a wave toward the sensor will turn the page. There is also great anticipation for such projects in medical technology, so that surgeons do not have to touch their computers during an operation: they can tell the system what to search for and browse through the results with swipes in the air.
