
Poster B66 in Poster Session B - Thursday, August 8, 2024, 1:30 – 3:30 pm, Johnson Ice Rink

Human-like behavior and neural representations emerge in a goal-driven model of overt visual search for natural objects

Motahareh Pourrahimi1,2, Irina Rish2, Pouya Bashivan1,2; 1McGill University, 2Mila

Like many other animals, humans direct their gaze to selectively sample the visual space according to task demands. Visual search, the process of locating a specific item among several visually presented objects, is a key paradigm for studying visual attention. While much is known about the brain networks underlying visual search, our understanding of the neural computations driving this behavior remains limited, making it challenging to simulate such behavior in silico. To address this gap, we trained an image-computable artificial neural network to perform naturalistic visual search. After training, the model generalized its search performance to novel object categories while exhibiting high behavioral consistency with human subjects. Further analysis of the model's population activity revealed an egocentric representation of the priority map, akin to those described in macaques, that persisted over time and was updated with each saccade, alongside an encoding of the cued object category in a separate subspace. Our model provides a computational framework for further studying the neural circuits underlying visual search in the primate fronto-parietal cortical network.
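To make the modeling setup concrete, the sketch below illustrates one plausible way such a goal-driven search agent could be structured: a convolutional encoder for the current foveal glimpse, a learned embedding of the cued target category, a recurrent core updated with each saccade, and linear readouts for a spatial priority map and a target-present decision. All module names, layer sizes, and the greedy saccade policy are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of a goal-driven overt visual search agent.
# Everything here (architecture, sizes, policy) is an assumption made
# for illustration; it is not the model described in the abstract.
import torch
import torch.nn as nn


class VisualSearchAgent(nn.Module):
    def __init__(self, n_categories: int, hidden_size: int = 256,
                 map_size: int = 8):
        super().__init__()
        # Encode the current foveal glimpse (assumed 32x32 RGB crop).
        self.glimpse_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(64 * 16, hidden_size),
        )
        # Embed the cued target category (the "what to look for" signal).
        self.cue_embedding = nn.Embedding(n_categories, hidden_size)
        # Recurrent core: integrates evidence across successive glimpses.
        self.core = nn.GRUCell(2 * hidden_size, hidden_size)
        # Read out a priority map over a grid of candidate fixation cells.
        self.priority_head = nn.Linear(hidden_size, map_size * map_size)
        # Read out whether the currently fixated item is the target.
        self.target_head = nn.Linear(hidden_size, 1)

    def step(self, glimpse, cue_id, h):
        g = self.glimpse_encoder(glimpse)
        c = self.cue_embedding(cue_id)
        h = self.core(torch.cat([g, c], dim=-1), h)
        priority = self.priority_head(h)   # logits over fixation cells
        found = self.target_head(h)        # target-present logit
        return priority, found, h


# Toy rollout: one search episode with random glimpses, greedy saccades.
if __name__ == "__main__":
    agent = VisualSearchAgent(n_categories=10)
    h = torch.zeros(1, 256)
    cue = torch.tensor([3])                  # search for category 3
    for t in range(5):
        glimpse = torch.randn(1, 3, 32, 32)  # stand-in for a foveated crop
        priority, found, h = agent.step(glimpse, cue, h)
        fixation = priority.argmax(dim=-1)   # next saccade target (greedy)
        print(f"step {t}: fixate cell {fixation.item()}, "
              f"target logit {found.item():.2f}")

In a sketch like this, the recurrent hidden state is the natural place to look for the abstract's reported phenomena: a priority-map readout that persists across steps and is remapped after each saccade, with the cue category encoded in a separate subspace of the same population.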

Keywords: visual search; artificial neural network; saccadic behavior; priority map
