
Development of facial expressions in virtual human agents can help healthcare and education

Published: 10th January 2024 Last updated: 25th January 2024

Technologies such as computer graphics and machine learning have helped to develop virtual agents that closely resemble humans in appearance and show increasingly human-like communicative behaviors. In her dissertation, conducted under the ViBE (Virtual Humans in the Brabant Economy) project, Julija Vaitonyte researched people’s perceptions of the appearance and behavior of virtual agents.

She wanted to know which features in the face were responsible for the perception of human-likeness. The results showed that two facial features in particular mark the difference between a virtual agent and a human being. One was the appearance of the skin; the other was related to the eyes. If the skin was smooth and the eyes lacked corneal reflections (the white highlights in the eyes), the face was identified as agent-like rather than human-like.
In another study, which summarized findings from different neurological experiments, she showed that the brain area responsible for processing faces in humans is less active in response to computer-generated faces than to real human faces.

By better understanding how humans process virtual agents, it becomes possible to construct more intuitive agents that can build trust and assist humans in healthcare and education, Vaitonyte states.

In her blog, Vaitonyte elaborates on applications of virtual agents.

Julija Vaitonyte defends her dissertation on January 12 at 13:30 in the Auditorium, with a livestream. The title of the dissertation is: The Face Puzzle: Decoding Human Perception of Digital Agents.