Intelligent Room Development

Supervisor:
Dr. Strausz György
Department of Measurement and Information Systems

Nowadays, the development of Intelligent Space applications is gaining more and more popularity. The development of the intelligent room is closely connected to intelligent space (iSpace) applications; its goal is to create an information environment that makes the lives of human users more comfortable and more efficient.

The goal of this thesis work is to design such an intelligent room architecture and to implement it in a physical environment to a certain degree. Accordingly, this work presents the designed system in detail, its current state of implementation, and possibilities and proposals for its further improvement.

The system has a graph-based knowledge base in which the individual nodes denote the concepts known to the iSpace. The nodes are connected to each other via different types of edges, which describe their respective relationships. The nodes themselves are nameless; they are identified through dictionaries, which ensures the language independence of the knowledge base. The knowledge base is divided into an abstract domain and an instance domain: the former stores the general concepts, while the latter stores the known physical instances of those concepts. The system keeps track of the agents in the room, as well as the services they can provide. It is able to monitor, comprehend and learn the commands of the human users in the room, and it can provide services and information when requested. Owing to the flexible structure of the knowledge base, devices can be added to or removed from the system at any time; only the knowledge base needs to be modified, which can be done without difficulty. The learning of commands is based on a hypothesis-based learning algorithm.
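
To illustrate the structure described above, a minimal Python sketch of such a graph-based knowledge base is given below. The class names, relation labels and the English dictionary are illustrative assumptions, not identifiers taken from the implemented system.

```python
# Minimal sketch of a graph-based knowledge base with nameless nodes,
# typed edges, per-language dictionaries and abstract/instance domains.
# All names here are illustrative assumptions, not from the thesis.

from dataclasses import dataclass, field
from itertools import count


@dataclass
class Node:
    """A nameless concept; only its numeric id and domain are intrinsic."""
    node_id: int
    domain: str                     # "abstract" or "instance"
    edges: list = field(default_factory=list)


@dataclass
class Edge:
    """A typed, directed relationship between two concepts."""
    relation: str                   # e.g. "is-a", "instance-of", "part-of"
    target: Node


class KnowledgeBase:
    def __init__(self):
        self._ids = count()
        self.nodes = {}
        # One dictionary per language maps human-readable words to node ids,
        # keeping the graph itself language independent.
        self.dictionaries = {"en": {}}

    def add_concept(self, domain, label=None, language="en"):
        node = Node(next(self._ids), domain)
        self.nodes[node.node_id] = node
        if label is not None:
            self.dictionaries.setdefault(language, {})[label] = node.node_id
        return node

    def relate(self, source, relation, target):
        source.edges.append(Edge(relation, target))

    def lookup(self, label, language="en"):
        return self.nodes[self.dictionaries[language][label]]


# Example: an abstract "lamp" concept and a physical lamp instance in the room.
kb = KnowledgeBase()
lamp = kb.add_concept("abstract", "lamp")
desk_lamp = kb.add_concept("instance", "desk lamp")
kb.relate(desk_lamp, "instance-of", lamp)
print(kb.lookup("desk lamp").domain)    # -> "instance"
```

Because the nodes carry no names of their own, supporting an additional language only requires adding another dictionary that maps its words to the existing node ids; the graph itself does not change.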

The physical aspects of the system are being implemented in the Intelligent Laboratory of the Donát Bánki Institute of Mechatronics and Vehicle Engineering at Óbuda University. The software aspects of the system described above were implemented on multiple personal computers using multiple cameras. A method that detects human skin regions in the camera images in near real time was also implemented; this is the first and one of the most important steps of hand gesture identification and facial-feature-based emotion extraction.
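
As an illustration of this step, the following sketch detects candidate skin regions in camera frames using a fixed YCrCb threshold with OpenCV. The colour bounds and the camera index are assumptions made for the sake of the example; the method actually used in the laboratory implementation may differ.

```python
# Sketch of near real-time skin-region detection on camera frames using
# a fixed YCrCb threshold with OpenCV. Threshold values and camera index
# are assumed for illustration, not taken from the thesis.

import cv2
import numpy as np

# Commonly cited YCrCb bounds for human skin tones (assumed values).
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)


def skin_mask(frame_bgr):
    """Return a binary mask of candidate skin pixels in a BGR frame."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    # Morphological opening removes isolated noise pixels from the mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)


if __name__ == "__main__":
    capture = cv2.VideoCapture(0)           # first attached camera (assumed)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = skin_mask(frame)
        cv2.imshow("skin regions", cv2.bitwise_and(frame, frame, mask=mask))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    capture.release()
    cv2.destroyAllWindows()
```

The resulting mask marks candidate hand and face regions, which can then be passed on to the gesture identification and emotion extraction stages.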
