News Release

How synchronization supports social interactions

Taking turns during conversations may help coordinate verbal and nonverbal cues

Peer-Reviewed Publication

PLOS

From unimodal to multimodal dynamics of verbal and nonverbal cues during unstructured conversation

Image caption: Schematic diagram of the relationship between intrapersonal and interpersonal synchronization, whether unimodal or multimodal. A) and B) represent intrapersonal synchronization among the modalities of a single speaker. A) Solid blue arrows highlight the unimodal relationships between the gestures generated by a single individual (i.e., head vs. head, head vs. wrist). B) Dashed blue arrows highlight the multimodal relationships between the voice and the gestures produced by a single individual (i.e., head vs. voice; wrist vs. voice). C) and D) represent the interpersonal synchronization between the modalities of speaker A and speaker B. C) Solid red arrows highlight the unimodal relationships between the movements of speaker A and speaker B, and between their voices (i.e., head vs. head, head vs. wrist, voice vs. voice). D) Dashed red arrows highlight the multimodal relationships between the voice of speaker A and the gestures of speaker B, and vice versa (i.e., head vs. voice; wrist vs. voice).

Credit: Fauviaux et al., 2024, PLOS ONE, CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/)

Turn-taking dynamics of social interactions are important for speech and gesture synchronization, enabling conversations to proceed efficiently, according to a study published September 25, 2024, in the open-access journal PLOS ONE by Tifenn Fauviaux from the University of Montpellier, France, and colleagues.

Conversations encompass continuous exchanges of verbal and nonverbal information. Previous research has demonstrated that gestures and speech synchronize at the individual level. But few studies have investigated how this phenomenon may unfold between individuals.

To fill this knowledge gap, Fauviaux and colleagues used an online dataset consisting of 14 sessions in which two people engaged in unstructured, face-to-face conversation and were free to talk about any topic. Each session contained between one and four discussions, and the conversations lasted from 7 to 15 minutes. The researchers analyzed both audio and motion data, measuring speech and gesture synchronization at different timescales. Specifically, they captured vocal properties through the speech amplitude envelope and movement properties through head and wrist gestures.
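As an illustration of how such measures are commonly derived, the Python sketch below extracts a speech amplitude envelope with a Hilbert transform and computes a lagged cross-correlation against a motion signal. This is a minimal sketch of the general technique, not the authors' analysis pipeline; the sampling rates, filter settings, and function names are all assumptions.

import numpy as np
from scipy.signal import hilbert, butter, filtfilt, correlate, correlation_lags

def amplitude_envelope(audio, fs, cutoff_hz=10.0):
    """Speech amplitude envelope: magnitude of the analytic signal,
    low-pass filtered to keep only the slow intensity fluctuations."""
    env = np.abs(hilbert(audio))
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, env)

def lagged_correlation(x, y, fs, max_lag_s=2.0):
    """Normalized cross-correlation of two z-scored signals,
    restricted to lags within +/- max_lag_s seconds."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    corr = correlate(x, y, mode="full") / len(x)
    lags = correlation_lags(len(x), len(y), mode="full")
    keep = np.abs(lags) <= int(max_lag_s * fs)
    return lags[keep] / fs, corr[keep]

# Hypothetical usage on toy data: amplitude-modulated noise stands in for
# speech, and a delayed copy of its envelope stands in for head motion.
fs_audio = 16_000
t = np.arange(0, 10, 1 / fs_audio)
audio = np.random.randn(len(t)) * (1 + np.sin(2 * np.pi * 2 * t))
env = amplitude_envelope(audio, fs_audio)
fs_motion = 100                                  # assumed motion frame rate
env = env[:: fs_audio // fs_motion]              # envelope is already low-passed
head_speed = np.roll(env, int(0.2 * fs_motion))  # toy motion lagging by 200 ms
lag_s, corr = lagged_correlation(env, head_speed, fs_motion)
print(f"peak r = {corr.max():.2f} at lag {lag_s[np.argmax(corr)]:+.2f} s")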

The results supported previous research on speech and gesture coordination at the individual level, revealing synchronization at all timescales of the conversation. That is, there was higher-than-chance synchronization between a given participant’s wrist and head movements, and similar synchronization between these movements and vocal properties.
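The "higher-than-chance" comparison implies a null distribution for the synchronization statistic. One standard way to construct such a null for paired time series, sketched below in Python, is with circularly time-shifted surrogates; this is a generic illustration of surrogate testing, not necessarily the statistical procedure used in the study, and all names and parameter values are assumptions.

import numpy as np

def sync_stat(x, y):
    """Toy synchronization statistic: zero-lag Pearson correlation."""
    return np.corrcoef(x, y)[0, 1]

def surrogate_pvalue(x, y, n_surrogates=1000, seed=0):
    """Compare observed synchronization against a null distribution built
    from circularly time-shifted copies of y: shifting destroys the
    alignment between the signals while preserving each one's structure."""
    rng = np.random.default_rng(seed)
    observed = sync_stat(x, y)
    shifts = rng.integers(len(y) // 10, len(y), size=n_surrogates)
    null = np.array([sync_stat(x, np.roll(y, s)) for s in shifts])
    p = (np.sum(null >= observed) + 1) / (n_surrogates + 1)
    return observed, p

# Hypothetical usage on two weakly coupled toy signals.
n = 3000
common = np.random.randn(n)
x = common + np.random.randn(n)
y = common + np.random.randn(n)
obs, p = surrogate_pvalue(x, y)
print(f"observed r = {obs:.2f}, surrogate p = {p:.3f}")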

Extending the literature, the researchers also found that gestures and speech synchronize between individuals. In other words, there was coordination between the voices and the bodies of the two speakers. Taken together, the findings suggest that this type of synchronization of verbal and nonverbal information likely depends on the turn-taking dynamics of conversations.

According to the authors, the study enriches our understanding of behavioral dynamics during social interactions at both the intrapersonal and interpersonal levels, and strengthens knowledge regarding the importance of synchrony between speech and gestures. Future research building on this study could shed light on prosocial behaviors and psychiatric conditions characterized by social deficits.

The authors add: “How do my speech and behaviors influence, or respond to, the speech and behaviors of the person I'm conversing with? This study answers this question by investigating the multimodal dynamics between speech and movements, both at the individual level and the dyadic level. Our findings confirm intrapersonal coordination between speech and gestures across all temporal scales. They also suggest that multimodal and interpersonal synchronization may be influenced by the speech channel, particularly the dynamics of turn-taking.”

###

In your coverage please use this URL to provide access to the freely available article in PLOS ONE: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0309831

Citation: Fauviaux T, Marin L, Parisi M, Schmidt R, Mostafaoui G (2024) From unimodal to multimodal dynamics of verbal and nonverbal cues during unstructured conversation. PLoS ONE 19(9): e0309831. https://doi.org/10.1371/journal.pone.0309831

Author Countries: France, USA

Funding: This study received funding from the Agence Nationale de la Recherche (ANR) for the project ENHANCER under grant agreement number ANR-22-CE17-0036 (https://anr.fr/Projet-ANR-22-CE17-0036). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.