Touch screen devices are becoming ever more common in everyday life, and this shift is transforming software development as well. As a relatively new Human-Computer Interface (HCI), the touch screen opens new dimensions for monitoring users' input gestures. Large databases can be built from these monitored gestures, and data mining techniques applied to them can reveal hidden information about the user. Several research projects have already investigated possible applications of this approach.
Adaptive learning software comprises applications that offer a dynamic learning path based on the learner's individual capabilities and prior knowledge. To make appropriate choices, such software usually bases its decisions on measured and processed physiological signals.
In this work, I investigated how artificial neural networks could be used to turn learning applications into adaptive software, based on information mined from input gestures. This method could eliminate the need for cumbersome physiological measurements.
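To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch, not the actual system from this work: a small feedforward neural network trained on entirely synthetic gesture features (swipe duration, path length, mean speed are assumed feature choices, and the "confident"/"hesitant" user states are hypothetical labels) to show how mined gesture statistics could feed a learner-state classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic gesture features: [swipe duration (s), path length (px), mean speed (px/s)].
# Class 0: hypothetical "confident" users (fast, direct swipes).
# Class 1: hypothetical "hesitant" users (slow, wandering swipes).
n = 200
confident = rng.normal([0.3, 400, 1300], [0.05, 50, 150], size=(n, 3))
hesitant = rng.normal([0.9, 600, 650], [0.10, 80, 120], size=(n, 3))
X = np.vstack([confident, hesitant])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Standardize features so gradient descent behaves well.
X = (X - X.mean(axis=0)) / X.std(axis=0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer (3 inputs -> 4 hidden units -> 1 output), plain gradient descent.
W1 = rng.normal(0, 0.5, (3, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(500):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    p = sigmoid(H @ W2 + b2).ravel()   # predicted probability of class 1
    g = (p - y)[:, None] / len(y)      # cross-entropy gradient at the output
    W2 -= lr * H.T @ g; b2 -= lr * g.sum(axis=0)
    gH = (g @ W2.T) * (1 - H**2)       # backpropagate through the tanh layer
    W1 -= lr * X.T @ gH; b1 -= lr * gH.sum(axis=0)

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In a real adaptive application, the feature vectors would come from logged touch events rather than a random generator, and the predicted state would drive the choice of the next learning step.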