2008-12-15 at 11:00
Conference Room 501, UMR 7102, UPMC, Bât. B, 5th floor, 9 quai Saint-Bernard, 75005 Paris
Brain-Computer interaction based on non-invasive EEG recordings.
The idea of moving robots or prosthetic devices not by manual control but by mere "thinking" (i.e., the brain activity of human subjects) has fascinated researchers for the last 30 years, but only now have the first experiments shown that this is possible. Such a brain-computer interface (BCI) is a natural way to augment human capabilities by providing a new interaction link with the outside world, and it is particularly relevant as an aid for physically disabled people. In this talk I will review our work on non-invasive asynchronous BCI, with a focus on how brainwaves can be used to directly control robots. Much of the hope for such a possibility comes from invasive approaches, which provide detailed single-neuron activity but require surgical implantation of microelectrodes in the brain. For humans, non-invasive systems based on electroencephalogram (EEG) signals are preferable but, until now, have been considered too poor and slow for controlling rapid and complex sequences of movements. Recently we have shown for the first time that online analysis of a few EEG channels, combined with advanced robotics and machine learning techniques, is sufficient for humans to continuously control a mobile robot and a wheelchair. Finally, I will discuss the research directions we are pursuing to improve the performance and robustness of our BCI system, especially for real-time control of brain-actuated robots; in particular, I will mention work on recognizing cognitive states that are crucial for interaction.