Developing Virtual Avatars Based on Orientation Sensor Data

Supervisor:
Dr. Fehér Gábor
Department of Telecommunications and Media Informatics

Abstract

Virtual reality is a young and rapidly evolving branch of technology, so many of the features it can offer remain unexplored and unstandardized. Its potential uses are far-reaching, since visually rich environments and objects can be simulated in the virtual world, extending our capabilities. Most people prefer to experience content visually rather than through text alone, which is why this field deserves considerable attention.

The aim of my work was to build a piece of software that works with orientation sensors and uses their data to animate an avatar, driving its transformation from the measured values. Each sensor unit consists of a gyroscope, a magnetometer and an accelerometer. From the data gathered by these components, a microcontroller computes the exact rotation of the object in the real world, encodes it as a quaternion and sends it out over its radio interface.
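To illustrate the orientation pipeline described above, the sketch below shows how gyroscope angular rates can be integrated into an orientation quaternion. This is a simplified Python illustration, not the microcontroller firmware of the thesis; the helper names (quat_multiply, integrate_gyro) and the pure gyroscope integration, which omits the accelerometer and magnetometer correction steps, are assumptions made for brevity.

    import math

    def quat_multiply(a, b):
        # Hamilton product of two quaternions given as (w, x, y, z) tuples
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return (
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        )

    def integrate_gyro(q, gyro_rad_s, dt):
        # Advance orientation quaternion q by the angular rate over dt seconds
        wx, wy, wz = gyro_rad_s
        # Quaternion kinematics: q_dot = 0.5 * q * (0, wx, wy, wz)
        dq = quat_multiply(q, (0.0, wx, wy, wz))
        q = tuple(qi + 0.5 * dqi * dt for qi, dqi in zip(q, dq))
        # Renormalize to counter numerical drift
        norm = math.sqrt(sum(c * c for c in q))
        return tuple(c / norm for c in q)

    # Example: identity orientation rotated at 90 deg/s around Z for 0.1 s
    q = integrate_gyro((1.0, 0.0, 0.0, 0.0), (0.0, 0.0, math.radians(90)), 0.1)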

For the implementation I chose the Unity3D game engine, which is free to use and offers a very broad toolkit for the simulation. The finished program connects to a given MQTT broker, which forwards the messages published by the sensors. From these messages the number of sensors is determined, and based on this the software offers configuration options to the user. The body-part-to-sensor assignments are then established and can be used to drive an avatar. After configuration the imported models are listed, and the user can select any of them. From this point the recording of movements can be started, and the generated files are stored in a dedicated folder on the computer. By navigating back to the main menu, the recordings can be reviewed and analyzed.
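To make the data flow concrete, the sketch below shows one way such sensor messages could be received over MQTT in Python. The broker address, the topic layout (sensors/+/quaternion) and the JSON payload format are illustrative assumptions rather than the actual configuration used in the thesis, and the paho-mqtt 1.x callback API is assumed.

    import json
    import paho.mqtt.client as mqtt

    BROKER_HOST = "broker.example.com"   # placeholder, not the real broker

    def on_connect(client, userdata, flags, rc):
        # Subscribe to every sensor's quaternion stream once connected
        client.subscribe("sensors/+/quaternion")

    def on_message(client, userdata, msg):
        sensor_id = msg.topic.split("/")[1]
        q = json.loads(msg.payload)       # e.g. {"w": 1, "x": 0, "y": 0, "z": 0}
        userdata.setdefault(sensor_id, []).append((q["w"], q["x"], q["y"], q["z"]))
        # The number of distinct sensor_ids seen so far tells the application
        # how many body-part slots to offer in the configuration menu.

    sensors = {}
    client = mqtt.Client(userdata=sensors)
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect(BROKER_HOST, 1883)
    client.loop_forever()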
