
Poster A147 in Poster Session A - Tuesday, August 6, 2024, 4:15 – 6:15 pm, Johnson Ice Rink

Neural Processing of Naturalistic Audiovisual Events in Space and Time

Yu Hu1, Yalda Mohsenzadeh1,2; 1Western University, 2Vector Institute for Artificial Intelligence

What we see and hear carries different physical properties, yet the brain integrates these distinct streams of information into a coherent percept. However, when real-world audiovisual events are perceived, the specific brain regions and time courses involved in processing and integrating different levels of information remain underexplored. To address this, we curated naturalistic videos and recorded fMRI and EEG data while participants viewed the videos with their accompanying sounds. We found that acoustic information was represented not only in auditory areas but also in early visual regions, suggesting an early cross-modal interaction and a role for it in combining acoustic features with visual features. In contrast, visual information was represented only in visual cortices, indicating that this early cross-modal interaction is asymmetric. Visual and auditory features were processed with similar onsets but different temporal dynamics. High-level categorical and semantic information was identified in higher-order and multimodal association areas and was resolved later in time, demonstrating a late cross-modal integration with a distinct role in converging conceptual information. Finally, we compared the neural representations with those of a two-branch deep neural network model and observed a mismatch in the early cross-modal interaction, suggesting that improvements are needed to build a more biologically plausible model of audiovisual perception.
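The abstract does not specify how the neural data were compared with the two-branch network, so the sketch below is illustrative only: it shows representational similarity analysis (RSA), a standard way to relate brain responses to deep-network activations. All names, shapes, and data (roi_patterns, visual_branch, audio_branch, the clip count) are hypothetical placeholders, not the authors' pipeline.

```python
# Illustrative RSA sketch (assumed method; not drawn from the paper).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(responses):
    """Condensed representational dissimilarity matrix.

    responses: (n_videos, n_features) array, e.g. voxel patterns from one
    fMRI ROI, EEG patterns at one time point, or DNN layer activations.
    """
    return pdist(responses, metric="correlation")  # 1 - Pearson r per pair

rng = np.random.default_rng(0)
n_videos = 60  # hypothetical number of audiovisual clips

# Hypothetical data standing in for one brain region and the two model branches.
roi_patterns = rng.standard_normal((n_videos, 500))    # fMRI ROI voxel patterns
visual_branch = rng.standard_normal((n_videos, 1024))  # DNN visual-branch features
audio_branch = rng.standard_normal((n_videos, 1024))   # DNN audio-branch features

# Correlating each branch's RDM with the neural RDM asks whether stimuli that
# the model treats as similar are also represented similarly in the brain.
neural_rdm = rdm(roi_patterns)
for name, feats in [("visual branch", visual_branch), ("audio branch", audio_branch)]:
    rho, p = spearmanr(rdm(feats), neural_rdm)
    print(f"{name}: Spearman rho = {rho:.3f} (p = {p:.3g})")
```

Run over early visual and auditory ROIs (fMRI) or over successive EEG time points, a comparison of this kind could reveal the asymmetries and temporal dynamics the abstract reports, and a model-brain mismatch would appear as a weak correlation where a strong one was expected.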

Keywords: Audiovisual Perception; Naturalistic Stimuli; Computational Models; Neural Representations and Dynamics
