Executive control over the production of facial expressions of emotion
Social Psychology, Industrial and Organisational Psychology
Final Report Abstract
The present project investigated the motor control of facial expressions, scoring participants’ performance through automated, software-based analysis of facial expressions. We established satisfactory reliability of the facial expression measurements obtained with the software FACET, and their convergence with reaction times (RTs) measured with EMG. These results encourage the use of this technique to investigate the motor control of facial expressions; however, we also observed some limitations of the software, for instance the presence of unexpected artifacts. We measured behavioral correlates of: a) reprogramming (quickly switching between expressions), with a response-priming task; b) top-down control required to ignore visual information incongruent with the facial expression to be produced, with a Stroop task; and c) inhibitory control over facial expressions, with a go/no-go task. The contingent negative variation (CNV) was confirmed as an ERP correlate of the motor preparation of facial expressions. Contrary to our expectation, the N2 component did not appear consistently across tasks as a correlate of the motor control of facial expressions. The P3 component was associated with the reprogramming and inhibition of facial expressions. One interesting finding is improved motor control of facial expressions of negative affect (anger, disgust, fear), relative to positive (smiles) and neutral (jaw drops) expressions. This advantage for negative affect appeared across tasks measuring reprogramming, posing facial expressions while ignoring visually presented emotional stimuli, and volitional inhibitory control. Data from the P3 component indicated that this efficient control relies on greater recruitment of brain systems dedicated to motor control. The CNV and N2 components revealed signs of emotion specificity in some tasks, but overall the results regarding these components were mixed. Regarding automaticity, the data so far do not consistently support the hypothesis that affective stimuli, including pictures of facial expressions and images of aversive and appetitive stimuli, elicit automatic reactions that interfere with control. Providing perceptual cues as examples for imitation helped participants reproduce expressions more accurately but did not affect the speed of reprogramming. Results did not confirm the hypothesis that providing an affiliative motive (i.e., attractiveness) would modulate control over facial expressions. However, this finding is not conclusive because ratings did not confirm the intended experimental manipulation of attractiveness. Interestingly, data from one experiment showed that smiles were more difficult to inhibit during face-to-face dyadic interaction when direct eye contact was established.
Publications
- Calvo, M.G., Avero, P., Fernández-Martín, A., & Recio, G. (2016). Recognition thresholds for static and dynamic emotional faces with variable expressive intensity. Emotion, 16(8), 1186-1200. (See online at https://doi.org/10.1037/emo0000192)
- Del Líbano, M., Calvo, M.G., Fernández-Martín, A., & Recio, G. (2018). Discrimination between smiles: Human observers vs. automated face analysis. Acta Psychologica, 187, 19-29. (See online at https://doi.org/10.1016/j.actpsy.2018.04.019)