
Poster C159 in Poster Session C - Friday, August 9, 2024, 11:15 am – 1:15 pm, Johnson Ice Rink

Investigating the neural computations underlying visual social inference with graph neural networks

Manasi Malik1, Minjae Kim1, Shari Liu1, Tianmin Shu1, Leyla Isik1; 1Johns Hopkins University

Recognizing people's interactions in visual scenes is a crucial human ability; however, the neural computations that enable it remain largely unknown. Prior work demonstrates that a bottom-up visual model with relational inductive biases (based on graph neural networks) successfully captures human behavior in social interaction judgments, suggesting that relational visual representations may underlie this ability. If relational visual computations are fundamental to social perception, we should find evidence for them in brain regions that support social perception, such as lateral occipitotemporal cortex (LOTC) and the posterior superior temporal sulcus (pSTS). To test this, we collected fMRI data from adults watching animated shape videos of two agents interacting in a friendly, neutral, or adversarial manner. Preliminary analysis using whole-brain searchlight representational similarity analysis (RSA) shows a correlation between neural and behavioral representations both in these social perception regions and in the theory of mind network. The graph neural network model also explains responses in LOTC and pSTS. In contrast, a matched bottom-up model without relational inductive biases correlates poorly with the neural data. Our work suggests that regions in LOTC and pSTS that support social interaction perception rely on relational visual information, and it provides a novel modeling framework for investigating the neural computations underlying social perception and cognition.
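
For readers unfamiliar with the analysis, the sketch below illustrates the core computation in searchlight RSA: correlating a neural representational dissimilarity matrix (RDM), built from one searchlight's response patterns, with a model RDM (here standing in for the graph-neural-network model or behavioral judgments). This is a minimal illustration with hypothetical names (rsa_correlation, neural_patterns, model_rdm) and toy random data, not the authors' actual pipeline.

    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.stats import spearmanr

    def rsa_correlation(neural_patterns, model_rdm):
        # Neural RDM as a condensed vector: correlation distance between
        # the response patterns for every pair of conditions (videos).
        neural_rdm_vec = pdist(neural_patterns, metric="correlation")
        # Compare against the unique upper-triangle entries of the model RDM;
        # pdist and triu_indices_from enumerate pairs in the same order.
        iu = np.triu_indices_from(model_rdm, k=1)
        rho, _ = spearmanr(neural_rdm_vec, model_rdm[iu])
        return rho

    # Toy usage: 30 conditions x 100 voxels from one searchlight sphere.
    rng = np.random.default_rng(0)
    patterns = rng.standard_normal((30, 100))
    model = rng.random((30, 30))
    model = (model + model.T) / 2   # symmetrize the toy model RDM
    np.fill_diagonal(model, 0.0)
    print(rsa_correlation(patterns, model))

In a whole-brain searchlight analysis, this correlation is computed once per sphere centered on each voxel, yielding a brain map of model-neural correspondence; Spearman correlation is a common choice because only the rank order of dissimilarities is assumed to be meaningful.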

Keywords: graph neural networks; social interaction recognition; fMRI; neuroAI
