Virtual and Interactive Environments for Human Improvement
This group, part of the Human-Technology Interaction unit, designs and uses virtual and interactive environments as a medium for observing, interpreting, and potentially improving human cognition and behavior.
We are located in the XR Lab and the Game Lab.
What we do:
- Design digital environments that improve human decision-making, health, and well-being
- Design and use multimodal sensing in Virtual & Extended reality applications
- Model users through affect and behavior
- Study user experience and impeding factors, such as motion sickness
- Analyze ethical aspects of artificial intelligence
Our partners and collaborators:
- Aristotle University of Thessaloniki - School of Physical Education and Sport Science
- Breda University of Applied Sciences - Experience Lab & Breda Guardians
- Lily Frank, TU/e - https://www.tue.nl/en/research/researchers/lily-frank
- Malcolm Ryan, Macquarie U. - https://researchers.mq.edu.au/en/persons/malcolm-ryan
- Malene Flensborg Damholdt and Arthur Bran Herbener, Aarhus U. - https://www.au.dk/en/show/person/malenefd@psy.au.dk
Our current projects:
- Project “Flight” - Gamified Motor Learning
- VR Methods of Loci (VRMLoc)
- Digital Interventions Incubator
- Virtual Day
- MasterMinds Project “Decision making in complex environments using serious games”
- SEED Grant: “Video Games and Well-being” with Matti Vuorre
- “Introducing MARC - a Multifunctional Autoregressive Chatbot for AI-delivered mental health interventions” at Aarhus University, Robophilosophy unit
- Starter Grant: “Video Games for Moral Improvement”
Some of our past (related) projects:
- Data2Game (www.data2game.nl) - Player modeling through player affect
- VR cybersickness and pain
- Memory palace / method of Loci memory training in VR
- VR study with multimodal data recorded, at the Science Museum in London: https://youtu.be/TNGxfcrq-us
- Head-tracked spatial audio in VR
- Design and validation of 3D stimuli libraries
- OCOsense (development of smart sensing wearables for facial data detection)
- Detection of depression markers from facial data
Some of our publications:
- Mavromoustakos-Blom, P., Bakkes, S., Spronck, P. (2019). Andromeda: A Personalised Crisis Management Training Toolkit. In: Liapis, A., Yannakakis, G., Gentile, M., Ninaus, M. (eds) Games and Learning Alliance. GALA 2019. Lecture Notes in Computer Science, vol 11899. Springer, Cham. https://doi.org/10.1007/978-3-030-34350-7_14
- Paris Mavromoustakos-Blom et al. (2023) "Gamified Motor Learning Through High-Fidelity Sensor Technology," 2023 IEEE 11th International Conference on Serious Games and Applications for Health (SeGAH), Athens, Greece, pp. 1-7, https://doi.org/10.1109/SeGAH57547.2023.10253778
- Paris Mavromoustakos-Blom, Stefan Methorst, Sander Bakkes, Pieter Spronck (2019) “Modeling and adjusting in-game difficulty based on facial expression analysis,” Entertainment Computing, Volume 31, 100307.
- Michał Klincewicz (2019) “Robotic Nudges for Moral Improvement through Stoic Practice,” Techné: Research in Philosophy and Technology 23 (3), pp. 425-455.
- Arthur Herbener, Michał Klincewicz, Malene Flensborg Damholdt (2024) “A Narrative Review of the Active Ingredients in Psychotherapy Delivered by Conversational Agents”, Computers in Human Behavior Reports.
- Michał Klincewicz and Lily Frank (2020). “Consequences of unexplainable machine learning for the notions of a trusted doctor and patient autonomy.” Proceedings of the 2nd EXplainable AI in Law Workshop (XAILA 2019) Co-Located with 32nd International Conference on Legal Knowledge and Information Systems (JURIX 2019).
- Gianluca Guglielmo, Michał Klincewicz, Elisabeth Huis in ’t Veld, Pieter Spronck (2024) “Introducing ‘Sustainable Port’: A Serious Game to Study Decision-Making in Port-Related Environments”, 2024 IEEE Gaming, Entertainment, and Media Conference.
- Mavridou, I. & Nduka, C. (2022) “A Virtual Reality platform for the objective measurement of emotional state,” in Applied Virtual Reality in Healthcare: Case Studies.
- Mavridou, I., Balaguer-Ballester, E., Nduka, C., Seiss, E. (2023) “A reliable and robust online validation method for creating a novel 3D Affective Virtual Environment and Event Library (AVEL),” PLoS ONE 18(4): e0278065. https://doi.org/10.1371/journal.pone.0278065
- Archer J., Mavridou I.*, Stankoski S., Broulidakis J., Cleal A., Walas P., Fatoorechi M., Gjoreski H., Nduka C. (2023) “OCOsense™ smart glasses for analyzing facial expressions using optomyographic sensors,” IEEE Pervasive Computing. https://doi.org/10.1109/MPRV.2023.3276471
- Warp, R., Zhu, M., Kiprijanovska, I., Wiesler, J., Stafford, S., & Mavridou, I. (2022, May). “Moved By Sound: How head-tracked spatial audio affects autonomic emotional state and immersion-driven auditory orienting response in VR Environments,” in Audio Engineering Society Convention 151. AES. http://www.aes.org/e-lib/browse.cfm?elib=21703