Multimodal XML-based pedestrian navigation for mobile devices

Supervisor:
Dr. Gyires-Tóth Bálint Pál
Department of Telecommunications and Media Informatics

Today, information technology has expanded beyond the possibilities offered by stationary desktop computers. As a result of the development of mobile devices and networks, new IT solutions have appeared in several new areas of life. This development is characterized by the integration of different tools and services: while tasks such as positioning, wireless Internet access, photography, and media playback once required dedicated devices, most of these capabilities are now built into ordinary mobile phones. Positioning plays a central role in my thesis.

There are several options for determining a user's current position. The best-known solution is GPS (Global Positioning System) satellite positioning. It provides sufficiently accurate data in most situations, but it is not the best option in all circumstances: it is unusable indoors because of the shielding effect of buildings. In such areas, Wi-Fi networks and mobile network cell information can be used instead to determine the position, and recent mobile devices support all of these technologies. As a result, a new group of services, location-based services (LBS), has emerged; these services depend on determining the position of users and making use of it in some way.
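
As a rough illustration of how such hybrid positioning can be consumed on Android (the platform targeted by this thesis; the class name PositionTracker is illustrative, while LocationManager and LocationListener are standard Android APIs), a client may simply subscribe to both the GPS and the network provider and use whichever fix arrives:

import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class PositionTracker implements LocationListener {

    public void start(Context context) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        // Outdoors: satellite positioning, accurate but blocked by buildings.
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 5000, 10, this);
        // Indoors and in urban canyons: coarser fix from Wi-Fi and cell information.
        lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 5000, 10, this);
    }

    @Override
    public void onLocationChanged(Location location) {
        // Use the most recent fix regardless of which provider produced it.
        System.out.println(location.getProvider() + ": "
                + location.getLatitude() + ", " + location.getLongitude());
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}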

In the present study I introduce the design and implementation of a system providing a special location-based service. The system is a pedestrian navigation application designed for the Android platform. It allows users to plan a public transport route to any destination in Budapest; in addition, the complete public transport timetables can be accessed and searched. During research and development, special attention was paid to creating a system that is comfortable for everyone. The currently fashionable touch-screen devices do not provide a suitable user interface for all groups of users: blind and visually impaired users can easily operate devices with a physical keyboard, because they can feel the buttons, but they face several problems on touch screens.
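
The two main features described above, route planning and timetable search, could be captured by an interface along the following lines (a hypothetical sketch only; the names RoutePlanner, Route, and Stop are illustrative and do not come from the thesis):

import java.util.Date;
import java.util.List;

// Hypothetical domain types; the real ones would carry full timetable data.
class Stop { String name; double latitude; double longitude; }
class Route { List<Stop> stops; }

// Hypothetical service interface for Budapest public transport queries.
interface RoutePlanner {
    // Plan a route between two stops for a given departure time.
    Route planRoute(Stop from, Stop to, Date departure);

    // Search the timetable data, e.g. by stop or line name.
    List<Stop> searchStops(String query);
}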

In my system I present a solution to this problem: a specific solution, based on a general theory, that lets blind and visually impaired users operate a touch screen. It is built on a multimodal user interface defined by an XML description. On the one hand, this is a speech user interface (SUI) integrated into a graphical user interface (GUI); on the other hand, it is a new approach to the use of graphical items, as the system exploits the device's different modalities and lets the user perceive the on-screen graphic elements through touch and hearing. In a mobile environment, everyone can benefit from speech user interfaces: there is no need to keep looking at the device's display, since a headset is sufficient to operate applications.
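
One plausible realisation of this touch-and-hearing exploration on Android (an assumed mechanism for illustration; the attribute spokenLabel and the class SpokenTouchListener are hypothetical, while TextToSpeech and View.OnTouchListener are standard Android APIs) is to attach a spoken label, taken from the XML interface description, to each control and announce it when the user's finger reaches it:

import android.speech.tts.TextToSpeech;
import android.view.MotionEvent;
import android.view.View;

public class SpokenTouchListener implements View.OnTouchListener {

    // Example entry in the XML UI description (hypothetical schema):
    //   <button id="searchButton" spokenLabel="Search timetable"/>

    private final TextToSpeech tts;
    private final String spokenLabel; // loaded from the XML UI description

    public SpokenTouchListener(TextToSpeech tts, String spokenLabel) {
        this.tts = tts;
        this.spokenLabel = spokenLabel;
    }

    @Override
    public boolean onTouch(View view, MotionEvent event) {
        // When the finger lands on the element, announce what it is,
        // so a blind user can explore the screen by touch and hearing.
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            tts.speak(spokenLabel, TextToSpeech.QUEUE_FLUSH, null);
        }
        // Returning false lets the normal touch handling proceed.
        return false;
    }
}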
