Future interaction between humans and machines will require a high level of awareness on the user's side and the automatic communication of the user's commands and mental states to the machine. My research bridges the fields of brain-computer interfaces (BCIs), robotics, and cognitive science by developing adaptive BCI systems that enhance human-machine interaction and expand human cognitive capacities. In my previous work, I focused on the sense of embodiment that operators experience during BCI operation of a humanlike robot, and I introduced a new neurofeedback training paradigm that could improve their learning of a motor imagery task. Currently, I am developing BCI-controlled robots and avatars that monitor users' brain activity in real time and deliver user-specific therapeutic interventions.