With advances in processors and parallel computing performance, virtual reality systems are playing an increasingly important role. My task was to embed optical sensors in a virtual environment, with the goal of providing the user with a physical interface for communicating with that environment. For this task I present a solution that applies optical tracking to create interaction tools. In the thesis I describe the development of a user interface that provides a variety of easily configurable input devices. Each input device is equipped with passive markers, which make it trackable after a learning process. The resulting 3D position and orientation data can be used to control the processes of the virtual reality system. The thesis describes a tracking procedure based on subgraph isomorphism, which consists of three main steps. The first step is identification, which includes extracting two-dimensional image features and finding their correspondences with the input device's model characteristics. The second step is estimating the position and orientation. The third step is filtering and prediction, which reduces measurement errors caused by noise and delay. The thesis also describes the procedures used during implementation, for which I used two Logitech C270 HD webcams and the OpenCV computer vision library in C++.