Science-fiction books and movies often include characters who have prosthetic limbs. Robotic arms and legs are usually depicted as special enhancements, granting their user extraordinary capabilities. Although fictional, these depictions convey a compelling vision: a future where every lost body part can be replaced and its functionality fully restored. Physical disabilities severely hamper a person's quality of life, and it is both a challenge and a responsibility to exploit technological opportunities to overcome this issue.
Prosthetic limbs have long been under development at various academic institutes and companies. However, there is still no widely accepted solution to the problem of channeling the user's intentions to a robotic prosthesis. The basic approach to prosthetic limb control is to access the user's neural signals by some means and extract the information relevant for controlling the prosthesis. The task is therefore to create a reliable neural-computer interface, which essentially has two parts. The first is the signal acquisition system, which records the user's relevant neural activity with sufficient quality for processing. The second is the signal processing system, which takes the acquired signals, infers the user's intentions from them, and finally produces a control signal for the prosthetic limb.
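The two-part interface described above can be sketched as a simple pipeline. This is only an illustrative skeleton, not any particular BCI implementation: the function names are hypothetical, and the "acquired" signal is simulated random data standing in for an amplifier readout.

```python
import numpy as np

def acquire_signal(n_channels=4, n_samples=256, seed=0):
    """Signal acquisition stage (sketch): a real system would read
    samples from an amplifier/ADC; here we simulate multichannel
    neural activity with random data."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((n_channels, n_samples))

def process_signal(signal):
    """Signal processing stage (sketch): extract a simple per-channel
    feature (mean squared amplitude as a crude power estimate) and
    map it to a discrete control command for the prosthesis."""
    power = np.mean(signal ** 2, axis=1)   # one feature per channel
    command = int(np.argmax(power))        # most active channel -> command
    return command

command = process_signal(acquire_signal())
```

A real interface would replace the simulated input with recorded EEG, and the naive power feature with the motor-activity detection methods discussed later in this thesis.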
Electroencephalography (EEG) is one possible method for gaining insight into brain activity. Hence, in my graduate studies I examined the feasibility of EEG-based Brain-Computer Interfaces (BCIs). In this thesis I present methods of EEG signal processing for motor activity detection, a set of software tools developed for BCI design, and an example BCI along with an experiment conducted in a non-laboratory environment.