The field of autonomous robots is one of the most actively researched areas of our day. There is much speculation about when vehicles that require no driver at all will appear on the roads. To make this vision real, many tasks need to be solved, for example the detection of a given object or the question of path planning. In my thesis I deal with one of these challenges: estimating the robot's own orientation and position. Thanks to modern technology, many sensors are available, so there are several methods for solving this problem.
My thesis uses a smartphone's accelerometer, magnetometer and gyroscope to solve the task mentioned in the title. The sensors are not perfect, so their signals need to be preprocessed. In my work I examine what information can be extracted from each sensor individually. Each approach has its own advantages and disadvantages. In order to increase accuracy and reliability, it is worth combining the results of the different methods using sensor fusion. The algorithm I chose is less complex than the widely used Kalman filter, so it can be implemented on a low-performance microcontroller.
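The abstract does not name the fusion algorithm, but a common choice that is simpler than a Kalman filter is the complementary filter, which blends the gyroscope's short-term integral with a drift-free but noisy accelerometer-derived angle. The sketch below is an illustrative assumption, not the thesis's actual implementation; the function name, signature and the weight alpha are all hypothetical.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step of a complementary filter (illustrative sketch).

    angle       -- previous fused angle estimate (degrees)
    gyro_rate   -- angular rate from the gyroscope (degrees/second)
    accel_angle -- angle computed from the accelerometer (degrees)
    dt          -- time since the last update (seconds)
    alpha       -- weight of the gyroscope path; close to 1 because the
                   gyroscope is trusted short-term, while the small
                   (1 - alpha) share of the accelerometer angle slowly
                   cancels the gyroscope's drift
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: previous estimate 0 deg, gyro reads 10 deg/s over 10 ms,
# while the accelerometer suggests the angle is 1 deg.
angle = complementary_filter(0.0, gyro_rate=10.0, accel_angle=1.0, dt=0.01)
```

Only one multiplication-heavy line runs per sample and no matrix algebra is needed, which is why such a filter fits on a low-performance microcontroller where a full Kalman filter would not.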
After reading the information provided here, the reader gains a comprehensive view of the problem and a possible solution to it. Furthermore, it becomes clear why the presented method cannot be applied where precision is a major factor.
In the last chapter of my thesis I present some techniques that can be used to achieve a more accurate result, so that the method can be applied in precision-critical systems as well.