
Poster C9 in Poster Session C - Friday, August 9, 2024, 11:15 am – 1:15 pm, Johnson Ice Rink

Deep Neural Networks Are Predictive of Neural Data Through Textures

Jessica Loke1, Lynn K.A. Soerensen2, Iris I.A. Groen1, H. Steven Scholte1; 1University of Amsterdam, 2Massachusetts Institute of Technology

Human visual processing is well predicted by deep neural networks (DNNs), yet what drives this predictive power is less understood. Interestingly, human visual cortices have recently been reported to represent objects in a texture-like fashion, akin to the texture bias commonly observed in DNNs. We hypothesized that this alignment of DNNs with human neural recordings is driven by DNNs’ ability to explain variance related to texture information in images. To test this, we recorded electroencephalography (EEG) signals from human participants (n=57) while they viewed three types of images: original natural images, texture-synthesized versions, and object-only versions. Next, we compared these neural representations with features extracted from five different DNN architectures processing the same images. Our results show that features extracted from texture-synthesized images are just as predictive of EEG responses as features extracted from the original images. Moreover, features extracted from texture-synthesized images were most predictive of EEG responses to texture-synthesized images. Our results suggest that DNNs’ ability to predict neural data derives from a shared bias for textures in the human visual cortex.
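The comparison described above is typically carried out with a cross-validated encoding model that maps DNN features onto neural responses. The sketch below illustrates that general approach only; it is not the authors' pipeline, and all data, shapes, and the choice of plain ridge regression are assumptions for illustration, with synthetic random arrays standing in for DNN activations and EEG recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins (assumed shapes): images x DNN-layer features, images x EEG channels.
n_images, n_features, n_channels = 200, 50, 16
X = rng.standard_normal((n_images, n_features))          # "DNN features" per image
W_true = rng.standard_normal((n_features, n_channels))
Y = X @ W_true + 0.5 * rng.standard_normal((n_images, n_channels))  # "EEG" responses

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'Y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

# 5-fold cross-validated encoding accuracy: correlation between held-out
# true and predicted responses, averaged over folds, per channel.
folds = np.array_split(rng.permutation(n_images), 5)
scores = np.zeros(n_channels)
for test_idx in folds:
    train_idx = np.setdiff1d(np.arange(n_images), test_idx)
    W = ridge_fit(X[train_idx], Y[train_idx])
    pred = X[test_idx] @ W
    for ch in range(n_channels):
        scores[ch] += np.corrcoef(pred[:, ch], Y[test_idx, ch])[0, 1] / len(folds)

print(f"mean cross-validated r = {scores.mean():.2f}")
```

Running the same fit with features extracted from texture-synthesized versus original images, and comparing the resulting per-channel accuracies, is one way to operationalize the comparison the abstract reports.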

Keywords: Visual processing; Object recognition; EEG; Texture
