Psychology Research News

Image: Brett M. Kavanaugh, Associate Justice of the Supreme Court of the United States, during the opening statement of the sexual assault hearing on September 27, 2018. Screenshot from video produced by C-SPAN.

Interpersonal communication is a complex process that involves the production and perception of social signals using different expressive modalities (auditory, visual, etc.). Investigating the relationships between different production modalities is essential to understand the subtleties of efficient communication.

With this in mind, BA Psychology student Savannah Sweeting conducted a case study looking at the relationship between verbal and nonverbal signals (facial behavior), as well as the effect of social context on emotional cues. Using the behavior analysis tools of the CanBeLab, she analyzed sections of Brett M. Kavanaugh’s sexual assault hearing in front of the US Senate Judiciary Committee.

The analysis revealed a greater variety of facial expressions during the interactive questioning segment than during the prepared, monologue-like opening statement. During the interactive follow-up questioning, different patterns of expression were observed depending on the political party of the addressee. Mr. Kavanaugh appeared to show more intense facial expressions towards Republican senators, and these expressions would commonly be labelled as negative. A greater diversity of facial expressions was displayed towards Democratic senators, the majority of which would also be labelled as negative. No particular association was found between facial behavior and speech acts.

All in all, this research shows that facial behavior differs depending on the social context, with more intense facial activity during interactive, direct, and unprepared speech. In addition, the study suggests that the diversity of expressions may depend on the social group of the addressee. Further research is needed to understand the relationship between speech acts and facial behavior. This work was supervised by Dr. Marc Mehu.