TY - CONF
T1 - Processing emotional expressions in the infant brain: do emotions affect each other?
AU - van den Boomen, C.
AU - Munsters, N.M.
AU - Kemner, C.
PY - 2016
Y1 - 2016
N2 - Background: Understanding of facial emotional expressions is key for an infant’s social and cognitive development. Processing of emotions changes with age: behavioral and electroencephalography (EEG) studies revealed that infants learn to discriminate emotions between 5 and 7 months of age, and that this process refines throughout childhood (Leppanen & Nelson, 2008). However, infant EEG studies typically investigate discrimination between two emotions, while those in older children and adults compare multiple ones (Batty & Taylor, 2006; Grossmann & Johnson, 2007). It is unknown whether the number or type of emotions influences discrimination between two emotions. Whether one can discriminate an emotion may depend on which other emotions are presented in the task. The current study suggests that this is indeed the case in infants. We conclude this based on a comparison between two experiments, which originally aimed to study how basic visual information (i.e., details related to higher spatial frequencies (HSF) or global information related to lower spatial frequencies (LSF)) drives emotion discrimination. Methods: In both experiments (first: N=23; second: N=55), infants aged 9-10 months viewed faces with neutral or fearful emotions. The second experiment included an additional condition containing happy emotions. All faces were filtered so as to contain either HSF or LSF information (Figure 1). Using EEG, we studied whether the N290 and P400 peak amplitudes (face-sensitive peaks) differed between the emotions, separately for LSF and HSF filtered faces. Results and discussion: In both experiments, emotion and visual information (LSF; HSF) interactively modulated the N290 peak amplitude (first: p=.005; second: p=.019), but not the P400 peak amplitude (first: p>.1; second: p>.1). However, the direction of the interaction at the N290 peak differed between experiments (Figure 2).
For global LSF information, emotions modulated the N290 amplitude in experiment 1 (higher amplitudes for fearful than neutral) but not in experiment 2. For detailed HSF faces, opposite effects were found in experiment 1 (lower amplitudes for fearful than neutral) and experiment 2 (higher amplitudes for fearful than neutral and happy). As only the processing of neutral faces differed between experiments, the difference seems to be due to neutral emotion processing. Possibly, neutral emotions are evaluated differently in the context of only fearful faces than in the context of both fearful and happy faces. These results suggest that the processing of an emotion depends on the emotions presented in the context. It is thus important to take the type of presented emotions into account when evaluating results on discrimination between emotions or when designing new experiments in infant research.
M3 - Other
ER -