Brain-computer interface (BCI) refers to the technology that bridges the human brain and external world by translating neural responses into action commands for computers and machines.
Dr. Maryam Alimardani, Principal Investigator
In the BCI lab, we collect brain signals from human subjects (e.g., using EEG sensors), develop AI classifiers that predict the user's intentions (e.g., to move or select) or mental states (e.g., emotion, attention) from these signals, and then deploy the predictive model in human-technology interaction scenarios to provide adaptive feedback to the user.
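The pipeline described above (signal features in, predicted mental state out, feedback decision adapted accordingly) can be sketched in a few lines. This is a hypothetical illustration on synthetic data, not the lab's actual models: the feature layout, class labels, and the `adapt_feedback` helper are all assumptions made for the example.

```python
# Hypothetical sketch of a BCI pipeline: classify a synthetic "mental
# state" (e.g., high vs. low workload) from band-power-like EEG features,
# then map the prediction to an adaptive-feedback decision.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic features: 200 trials x 4 channels of band power.
# "High workload" trials (label 1) have elevated power in channels 0-1.
n_trials = 200
labels = rng.integers(0, 2, size=n_trials)
features = rng.normal(0.0, 1.0, size=(n_trials, 4))
features[labels == 1, :2] += 1.5

# Train a simple classifier on the first 150 trials, test on the rest.
split = 150
clf = LinearDiscriminantAnalysis()
clf.fit(features[:split], labels[:split])
accuracy = clf.score(features[split:], labels[split:])

def adapt_feedback(trial):
    """Turn the predicted workload probability into a feedback decision
    (an illustrative rule, not the lab's actual adaptation logic)."""
    p_high = clf.predict_proba(trial.reshape(1, -1))[0, 1]
    return "ease task" if p_high > 0.5 else "increase difficulty"

print(f"test accuracy: {accuracy:.2f}")
print(adapt_feedback(features[-1]))
```

In a real system, the synthetic features would be replaced by band-power or other features extracted from live EEG, and the feedback rule would drive the training simulator or robot in real time.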
By integrating BCI systems into human-technology interaction scenarios, we examine how the user experience (UX) changes for the better and how we can augment users' performance on different tasks.
For example, by looking at pilots' brain activity in a virtual reality (VR) training simulator, we investigate the possibility of a BCI system that can monitor their workload and improve their learning performance on demanding flight maneuvering tasks.
We also implement BCI systems in human-robot interaction experiments to enable real-time monitoring of task engagement during robot-assisted learning tasks.
Some BCI systems (e.g., motor imagery BCIs) require extensive user training before they can be employed effectively by the user. Using immersive virtual reality environments, we investigate the impact of embodiment and gamification in expediting BCI user training.
- MasterMinds: Neurotechnology and Virtual Reality in Aviation Training
- User Training in Motor Imagery BCIs
- Neuroadaptive Human-Robot Interaction
- Royal Netherlands Air Force
- multiSIM v.b.
- Unravel Research