
Cognitive Neuropsychology

The research projects that we have brought together under the heading Cognitive Neuropsychology concern the relationship between biological processes on the one hand and cognitive and/or emotional processes on the other. How can biological mechanisms, often brain mechanisms, contribute to our understanding of cognition and emotion, and how can they help in the case of disorders?

We conduct research exclusively in humans, in both clinical and non-clinical populations, and across the lifespan, from infants and young children to old age. We use a variety of modern measurement techniques, ranging from advanced behavioral paradigms to electroencephalography (EEG) and functional magnetic resonance imaging (fMRI).

Below you will find a selection of current research.

How does the brain combine information from various senses into unified percepts?

As we go about our daily lives, our brain is constantly exposed to a vast amount of sensory information. In a crowded bar, for example, our brain receives input from many sources via our senses. Entering the bar might feel overwhelming at first, but after a short while we grow accustomed to the changes in our environment. Despite all the distractions and background noise, we are still able to understand what our friends are saying by 'tuning in' to the sound of their voice and observing their visual articulatory movements. Without this lipread information, however, holding a conversation in such noisy surroundings becomes much more difficult. Binding auditory and visual speech signals into unified percepts thus greatly enhances our ability to comprehend speech, especially under suboptimal listening conditions. This process of combining sensory information from various senses into unified percepts is commonly referred to as multisensory integration.
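
One classic formal account of such binding is statistically optimal cue combination, in which each sense contributes to the unified percept in proportion to its reliability. The sketch below illustrates this idea with hypothetical auditory and visual location estimates; it is meant only as a conceptual illustration and is not necessarily the model tested in this project.

    import numpy as np

    def combine_cues(mu_a, sigma_a, mu_v, sigma_v):
        # Inverse-variance (maximum-likelihood) weighting: the more reliable
        # sense gets the larger weight in the unified percept.
        w_a = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
        w_v = 1 - w_a
        fused_estimate = w_a * mu_a + w_v * mu_v
        fused_sd = np.sqrt(1 / (1 / sigma_a**2 + 1 / sigma_v**2))
        return fused_estimate, fused_sd

    # Hypothetical example: a noisy auditory location estimate (10 deg, sd 4)
    # combined with a sharper visual estimate (12 deg, sd 2). The fused
    # estimate is pulled towards the visual cue and is more precise than either.
    print(combine_cues(10.0, 4.0, 12.0, 2.0))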

The main aim of this project is to unravel the underlying mechanisms of multisensory integration through examination of behavioral and neural correlates of multisensory processing, with a particular focus on audiovisual information.

Team

Publications

The anticipating brain: predictive coding in sensory processing

The way we perceive the world around us is not only based on the information that we receive through our senses, but is also shaped by our past experiences. A contemporary theoretical framework that describes how sensory information and previous experience are processed and integrated, commonly referred to as predictive coding, assumes that our brain continuously generates an internal predictive model of the world, based both on the information arriving through our senses and on events we have experienced in the past. This internal model enables us to 'make sense' of the world around us and ensures that our cognitive resources are primarily allocated to novel or otherwise relevant information.
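
The core computational idea behind predictive coding can be sketched in a few lines: the internal model issues a prediction, the prediction is compared with the incoming sensory signal, and the resulting prediction error is used to update the model. The toy example below uses made-up numbers purely to illustrate this principle; it is not the model or analysis used in this project.

    import numpy as np

    # Toy illustration of the predictive-coding principle: generate a prediction,
    # compare it with the (noisy) sensory input, and use the prediction error to
    # update the internal model.
    rng = np.random.default_rng(0)
    true_state = 5.0        # hidden state of the world
    prediction = 0.0        # current estimate of the internal model
    learning_rate = 0.2     # how strongly prediction errors update the model

    for trial in range(20):
        sensory_input = true_state + rng.normal(scale=0.5)  # noisy observation
        prediction_error = sensory_input - prediction       # the "surprise"
        prediction += learning_rate * prediction_error      # model update

    print(f"final prediction: {prediction:.2f}")             # close to 5.0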

The main aim of this project is to unravel the underlying mechanisms of the ability to predict upcoming sensory stimulation by examining the behavioral and neural correlates of predictive coding in sensory processing, with a particular focus on multisensory signals.

Team

Publications

Sensory processing and predictive coding in autism spectrum disorder

Being able to predict what we are about to see, hear, touch, smell, and taste enables us to produce appropriate behavioral responses, which are crucial for effective engagement and (social) interaction with the world around us. For example, when we are about to enter a busy restaurant that serves seafood, we typically expect to smell salty fish and to hear the sound of people talking, which is why we are usually not surprised when we actually smell and hear those things once we are inside.

Failing to accurately predict upcoming sensory information may lead to atypical behavioral responses to sensory stimulation, including hypo- and hyperresponsiveness. Both symptoms are commonly seen in autism spectrum disorder (ASD). Recently, we have shown that individuals with ASD may have alterations in the ability to anticipate upcoming sensory stimulation. Understanding the neural basis of these alterations in predictive coding may be a fundamental part of explaining why individuals with ASD often struggle with social communication and interaction with their environment.

The main aim of this project is to capture the underlying mechanisms of sensory anticipation in ASD with objective behavioral and electrophysiological (EEG) biomarkers.

Team

External partners

  • Mart Eussen (Yulius Mental Health)

Publications

It’s your turn! Unravelling the cognitive and neural processes of conversations in neurotypical and autism spectrum disorder populations

In conversations, people seem to know intuitively when to take their turn. Listeners use lexical, semantic, and syntactic cues to predict the end of a speaker's turn. Switching between speakers is usually very rapid (~200 ms) and notably shorter than the time it takes to produce a single word (~600 ms). In the literature, it is hypothesized that these smooth transitions between turns are only possible when speech comprehension and speech production overlap. However, little is known about the cognitive and neural processes that are involved in conversations. The main question of the project is: what are the brain mechanisms underlying these cognitive processes during conversations? We will apply a technique called hyperscanning, in which the electrical brain activity (electroencephalography: EEG) of two speakers in conversation is measured simultaneously. The first phase of the project will start in the fall of 2020 and will delineate the neural substrates underlying the different speech comprehension and speech production stages in neurotypical individuals. The second phase studies the behavioral and neural correlates of conversations in individuals with autism spectrum disorder (ASD). One of the core symptoms of ASD is impaired social communication. By using actual conversations, we will investigate these deficits in highly natural and ecologically valid settings.
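
Hyperscanning data are commonly analysed by quantifying synchrony between the two participants' brain signals. One widely used measure is the phase-locking value (PLV). The sketch below computes the PLV for two hypothetical, band-pass-filtered EEG channels (one per speaker); it illustrates the general idea and is not a description of this project's actual analysis pipeline.

    import numpy as np
    from scipy.signal import hilbert

    def phase_locking_value(x, y):
        # PLV between two equally long, band-pass-filtered signals: 1 means the
        # phase difference is perfectly constant, 0 means no consistent relation.
        phase_x = np.angle(hilbert(x))
        phase_y = np.angle(hilbert(y))
        return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

    # Hypothetical example: 10 s of EEG per speaker, sampled at 500 Hz, with a
    # shared 10 Hz component plus noise.
    fs = 500
    t = np.arange(0, 10, 1 / fs)
    speaker_a = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    speaker_b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.5 * np.random.randn(t.size)
    print(f"inter-brain PLV: {phase_locking_value(speaker_a, speaker_b):.2f}")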

Team

What did you say? Neural mechanisms of boosting speech adaptation

Human speech is sometimes difficult to understand due to background noise, an unfamiliar accent of the speaker, or poor quality of the speech signal itself. However, listeners quickly adapt to this situation, because there is often other information available that tells them what the intended spoken message should be. This 'other' information might be lipread speech if the speaker can be seen, or, in a movie, subtitles that can be read while the speech is heard. Lexical constraints are also important in disambiguating the intended message. For example, a sound that could be heard as either /g/ or /k/ is perceived as /g/ when followed by 'ift' but as /k/ when followed by 'iss', because only 'gift' and 'kiss' are legal words in English (Ganong, 1980). In this project, we hypothesize that these extra information sources – lipread speech, written text, and lexical information – can all provide a basis for determining the discrepancy between the intended and the actual speech signal that guides adaptive changes in speech perception. However, their effectiveness has never been directly compared. We will therefore examine whether lipread speech, written text, and lexical information differ in their efficacy in boosting adaptation to distorted speech.

Team

Publications

  • Stekelenburg, J. J., Keetels, M., & Vroomen, J. (2018). Multisensory integration of speech sounds with letters vs. visual speech: Only visual speech induces the mismatch negativity. European Journal of Neuroscience, 47(9), 1135-1145.

Spontaneous facial mimicry in people with autistic-like traits

Faces play an important role in human interaction. In a natural conversation, there is often facial mimicry, which can be described as the unconscious and unintentional copying of another person's facial expressions. Past research has shown that people with autism spectrum disorder (ASD) tend to engage less in spontaneous facial mimicry than typically developing (TD) individuals (McIntosh et al., 2006; Oberman, Winkielman, & Ramachandran, 2009). Recent research also suggests that autistic-like traits are continuously distributed throughout the population. The Adult Autism Spectrum Quotient (AQ; Baron-Cohen, Wheelwright, Skinner, Martin, & Clubley, 2001) can therefore be used as a measure of symptoms related to autism spectrum conditions in TD adults. In this project, we will examine whether TD individuals who score either high or low on the AQ differ in their spontaneous facial mimicry, using a test developed by Mui et al. (2018).

Team

Publications

Speech Perception in Context

Human speech perception is primarily auditory in nature. However, spoken language is often ambiguous or degraded, and our brain relies on the available context to support correct perception of the spoken input. We investigate how contextual cues such as visual information (seeing text or lip movements), lexical information (our internal knowledge of the existing words in a language), and emotional prosody (is a person happy or sad?) influence speech perception at the behavioral level (what do you hear?) and modulate the underlying neurophysiological correlates of auditory speech processing. We are interested in healthy populations as well as in people who suffer from a developmental disorder such as dyslexia, or who display other problems linked to speech/language perception.

Project duration: ongoing

Team

Publications

  • Bourguignon, M., Baart, M., Kapnoula, E. C., & Molinaro, N. (2020). Lip-reading enables the brain to synthesize auditory features of unknown silent speech. The Journal of Neuroscience, 40, 1053-1065.
  • López Zunini, R. A., Baart, M., Samuel, A. G., & Armstrong, B. C. (2020). Lexical access versus lexical decision processes for auditory, visual, and audiovisual items: Insights from behavioral and neural measures. Neuropsychologia, 137, 107305.
  • Lindborg, A., Baart, M., Stekelenburg, J. J., Vroomen, J., & Andersen, T. S. (2019). Speech-specific audiovisual integration modulates induced theta-band oscillations. PLOS ONE, e0219744.
  • PourHasemi, S. F., Baart, M., & Vroomen, J. (2019, December 20). Auditory learning of noise-vocoded speech by lip-read information: Does reading skill matter? [Poster presentation]. The 17th NVP Winter Conference on Brain & Cognition, Egmond aan Zee, The Netherlands.

Decoding emotions

Emotions play a central role in almost all aspects of human life. For instance, they influence our decision-making, how we deal with setbacks, and how satisfied we are with our lives. Our research aims to decode and classify categories of emotions from brain activity. This means that we let participants experience emotions while we measure their brain activity with EEG. We then look for patterns in these responses that are shared within an emotion category and that differ between categories.

In order to trigger strong and lifelike emotional experiences, we make use of Virtual Reality. To detect the commonalities and differences between emotion responses, we use state-of-the-art machine learning techniques.
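
The general shape of such a decoding analysis can be sketched as follows: each trial is summarized as a feature vector (for example, EEG band power per channel), labeled with the emotion the participant experienced, and fed into a cross-validated classifier. The code below is a minimal, hypothetical sketch of this kind of pipeline using scikit-learn with random placeholder data; it is not the project's actual feature set, classifier, or data.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Hypothetical data: one feature vector per trial (e.g., EEG band power per
    # channel) and one emotion-category label per trial. Random numbers are used
    # here only so the sketch runs on its own.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 64))        # 200 trials x 64 features
    y = rng.integers(0, 4, size=200)      # 4 emotion categories (labels 0-3)

    # A standard decoding pipeline: standardize the features, fit a linear
    # classifier, and estimate accuracy with 5-fold cross-validation.
    decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    scores = cross_val_score(decoder, X, y, cv=5)
    print(f"mean decoding accuracy: {scores.mean():.2f}")  # ~0.25 for random data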

In this project, we study the brain activity of healthy humans. The knowledge acquired can then be applied to detect deviations from normalcy in people who experience problems of some sort. It can also help in developing treatments and in assessing the effectiveness of those treatments.

Team

Heart rate and facial EMG indices of antisocial behavior and psychopathic traits in children, adolescents, or adults

Oppositional defiant disorder (ODD) and conduct disorder (CD) are the most prevalent psychiatric disorders in children and adolescents. The Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR) defines ODD as a recurrent pattern of negativistic, defiant, disobedient, and hostile behavior towards authority figures. CD is a more severe disorder, characterized by a repetitive and persistent pattern of behavior in which the basic rights of others or major age-appropriate societal norms or rules are violated. ODD may be a developmental precursor of CD in late childhood and adolescence, which, in turn, may be a developmental antecedent of antisocial personality disorder (APD) in adulthood. Callous-unemotional (CU) traits identify an important subgroup of antisocial youths in forensic, clinical, and community samples. Children and adolescents with CU traits show particularly high rates of conduct problems, delinquency, and police contacts. CU traits are an extension of the interpersonal-affective dimension of adult psychopathy (a special case of APD) and include low emotional responsiveness, particularly low fearfulness, and a lack of empathy, guilt, or remorse. We have shown that ODD, CD, and particularly CU traits are associated with abnormal heart rate and facial EMG responses to emotional scenes involving other people that normally elicit empathy. We have also found that CU traits are associated with subnormal spontaneous facial mimicry responses to the emotional facial expressions of others.

Heart rate and facial EMG responses are easy to record during experimental and real-life situations and may have diagnostic value. In particular, spontaneous facial EMG responses may give insight into brain processes that are hardly, or not at all, accessible with standard neuroimaging measures such as fMRI, due to the insufficient temporal resolution of such measures. Using heart rate and EMG measures, we are currently extending our research from youths to adult criminal psychopathic inpatients. In this study, we evaluate whether treatment has positive effects, as reflected in (1) heart rate and facial EMG responses to empathy-eliciting movies, and (2) facial mimicry of dynamic emotional facial expressions.
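
For illustration, a typical way to quantify facial EMG activity over time is to band-pass filter the raw signal, rectify it, and smooth it into an amplitude envelope. The sketch below shows such a generic processing chain with hypothetical parameters; it is not necessarily the exact pipeline used in our studies.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def emg_envelope(raw_emg, fs, band=(20.0, 400.0), smooth_hz=10.0):
        # Generic facial EMG processing chain: band-pass filter, full-wave
        # rectification, low-pass smoothing into an amplitude envelope.
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, raw_emg)      # remove drift and broadband noise
        rectified = np.abs(filtered)            # full-wave rectification
        b, a = butter(4, smooth_hz / (fs / 2), btype="low")
        return filtfilt(b, a, rectified)        # smoothed amplitude envelope

    # Hypothetical example: 5 s of corrugator supercilii EMG sampled at 1000 Hz.
    fs = 1000
    raw = np.random.randn(5 * fs)
    envelope = emg_envelope(raw, fs)
    print(envelope.shape)                        # (5000,)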

Team

  • Minet de Wied (Department of Youth & Family, Faculty of Social and Behavioral Sciences, Utrecht University, Utrecht, The Netherlands)
  • Anton van Boxtel
  • Ronald J.P. Rijnders (Netherlands Institute for Forensic Psychiatry and Psychology, Utrecht, The Netherlands)

Publications

  • van Boxtel, A., Zaalberg, R., & de Wied, M. Reduced facial mimicry responses to dynamic emotional facial expressions in male adolescents with disruptive behavior disorders and callous-unemotional traits. Submitted for publication.
  • de Wied, M., Meeus, W., & van Boxtel, A. Disruptive behavior disorders and psychopathic traits in adolescents: Empathy-related responses to witnessing animal distress. Journal of Psychopathology and Behavioral Assessment, under revision.
  • van der Graaff, J., Meeus, W., de Wied, M., van Boxtel, A., van Lier, P., & Branje, S. (2016). Respiratory sinus arrhythmia moderates the relation between parent-adolescent relationship quality and adolescents' social adjustment. Journal of Abnormal Child Psychology, 44, 269-281.
  • van der Graaff, J., Meeus, W., de Wied, M., van Boxtel, A., van Lier, P.A.C., Koot, H.M., Branje, S. (2016). Motor, affective and cognitive empathy in adolescence: interrelations between facial electromyography and self-reported trait and state measures. Cognition and Emotion, 30, 745-761.
  • de Wied, M., van Boxtel, A., Matthys, W., & Meeus, M. (2012). Verbal, facial and autonomic responses to empathy-eliciting film clips by disruptive male adolescents with high versus low callous-unemotional traits. Journal of Abnormal Child Psychology, 40, 211-223.
  • de Wied, M., Gispen-de Wied, C., & van Boxtel, A. (2010). Empathy dysfunction in children and adolescents with disruptive behavior disorders. European Journal of Pharmacology, 626, 97-103.
  • de Wied, M., van Boxtel, A., Posthumus, J.A., Goudena, P.P., & Matthys, W. (2009). Facial EMG and heart rate responses to emotion-inducing film clips in boys with disruptive behavior disorders. Psychophysiology, 46, 996-1004.
  • de Wied, M., van Boxtel, A., Zaalberg, R., Goudena, P.P., & Matthys, W. (2006). Facial EMG responses to dynamic emotional facial expressions in boys with disruptive behavior disorders. Journal of Psychiatric Research, 40, 112-121.

Reading a text: effects of experienced emotion, mental simulation, and moral evaluation as indicated by facial EMG responses

When someone is reading a story about another person, such as a novel or a news report, the reader may adopt a variety of imaginary social or emotional positions relative to the protagonist. Depending on this attitude, quite different cognitive or emotional responses may arise in the reader, such as agreement, empathy, sympathy, being moved, compassion, admiration, aversion, jealousy, Schadenfreude, ingroup versus outgroup sentiments, and so on. On the basis of overt voluntary responses such as verbal reports, it is difficult to gain insight into these complex subjective experiences. Overt voluntary responses may be unreliable, rationalized, or socially desirable, or may not sufficiently reflect quick changes in the reader's subjective experience. Measuring spontaneous EMG responses of specific facial muscles, which have previously been shown to reliably reflect cognitive and emotional processes, may give insight into subjective experiences during reading, since these responses can hardly be suppressed voluntarily and occur with short latency.

These facial EMG measures may thus give rather specific insight into the internal cognitive and emotional processes that occur while someone is reading a text, the more so because spontaneous activity of certain facial muscles has been demonstrated to index such processes with high reliability and validity. The relevance of this project is primarily scientific.

Team

  • Björn ’t Hart (Utrecht Institute of Linguistics OTS, Utrecht University, Utrecht, The Netherlands)
  • Marijn E. Struiksma (Utrecht Institute of Linguistics OTS, Utrecht University, Utrecht, The Netherlands)
  • Jos J.A. van Berkum (Utrecht Institute of Linguistics OTS, Utrecht University, Utrecht, The Netherlands)
  • Anton van Boxtel

Publications

  • 't Hart, B., Struiksma, M.E., van Boxtel, A., & van Berkum, J.J.A. Reading about us and them: Moral and minimal group effects on language-induced emotion. Manuscript under revision.
  • 't Hart, B., Struiksma, M.E., van Boxtel, A., & van Berkum, J.J.A. (2019). Tracking affective language comprehension: simulating and evaluating character affect in morally loaded narratives. Frontiers in Psychology, 10, 318.