Standard

Auditory dyadic interactions through the “eye” of the social brain: How visual is the posterior STS interaction region? / Landsiedel, Julia; Koldewyn, Kami.
In: Imaging Neuroscience, Vol. 1, No. 1, 10.08.2023, p. 1-20.

Research output: Contribution to journal › Article › peer-review

Vancouver

Landsiedel J, Koldewyn K. Auditory dyadic interactions through the “eye” of the social brain: How visual is the posterior STS interaction region? Imaging Neuroscience. 2023 Aug 10;1(1):1-20. doi: 10.1162/imag_a_00003

RIS

TY - JOUR

T1 - Auditory dyadic interactions through the “eye” of the social brain: How visual is the posterior STS interaction region?

AU - Landsiedel, Julia

AU - Koldewyn, Kami

N1 - © 2023 Massachusetts Institute of Technology. Published under a Creative Commons CC BY-NC 4.0 license.

PY - 2023/8/10

Y1 - 2023/8/10

N2 - Human interactions contain potent social cues that meet not only the eye but also the ear. Although research has identified a region in the posterior superior temporal sulcus as being particularly sensitive to visually presented social interactions (SI-pSTS), its response to auditory interactions has not been tested. Here, we used fMRI to explore brain response to auditory interactions, with a focus on temporal regions known to be important in auditory processing and social interaction perception. In Experiment 1, monolingual participants listened to two-speaker conversations (intact or sentence-scrambled) and one-speaker narrations in both a known and an unknown language. Speaker number and conversational coherence were explored in separately localised regions-of-interest (ROI). In Experiment 2, bilingual participants were scanned to explore the role of language comprehension. Combining univariate and multivariate analyses, we found initial evidence for a heteromodal response to social interactions in SI-pSTS. Specifically, right SI-pSTS preferred auditory interactions over control stimuli and represented information about both speaker number and interactive coherence. Bilateral temporal voice areas (TVA) showed a similar, but less specific, profile. Exploratory analyses identified another auditory-interaction sensitive area in anterior STS. Indeed, direct comparison suggests modality specific tuning, with SI-pSTS preferring visual information while aSTS prefers auditory information. Altogether, these results suggest that right SI-pSTS is a heteromodal region that represents information about social interactions in both visual and auditory domains. Future work is needed to clarify the roles of TVA and aSTS in auditory interaction perception and further probe right SI-pSTS interaction-selectivity using non-semantic prosodic cues.

AB - Human interactions contain potent social cues that meet not only the eye but also the ear. Although research has identified a region in the posterior superior temporal sulcus as being particularly sensitive to visually presented social interactions (SI-pSTS), its response to auditory interactions has not been tested. Here, we used fMRI to explore brain response to auditory interactions, with a focus on temporal regions known to be important in auditory processing and social interaction perception. In Experiment 1, monolingual participants listened to two-speaker conversations (intact or sentence-scrambled) and one-speaker narrations in both a known and an unknown language. Speaker number and conversational coherence were explored in separately localised regions-of-interest (ROI). In Experiment 2, bilingual participants were scanned to explore the role of language comprehension. Combining univariate and multivariate analyses, we found initial evidence for a heteromodal response to social interactions in SI-pSTS. Specifically, right SI-pSTS preferred auditory interactions over control stimuli and represented information about both speaker number and interactive coherence. Bilateral temporal voice areas (TVA) showed a similar, but less specific, profile. Exploratory analyses identified another auditory-interaction sensitive area in anterior STS. Indeed, direct comparison suggests modality specific tuning, with SI-pSTS preferring visual information while aSTS prefers auditory information. Altogether, these results suggest that right SI-pSTS is a heteromodal region that represents information about social interactions in both visual and auditory domains. Future work is needed to clarify the roles of TVA and aSTS in auditory interaction perception and further probe right SI-pSTS interaction-selectivity using non-semantic prosodic cues.

U2 - 10.1162/imag_a_00003

DO - 10.1162/imag_a_00003

M3 - Article

C2 - 37719835

VL - 1

SP - 1

EP - 20

JO - Imaging Neuroscience

JF - Imaging Neuroscience

SN - 2837-6056

IS - 1

ER -