
Poster B83 in Poster Session B - Thursday, August 8, 2024, 1:30 – 3:30 pm, Johnson Ice Rink

Tracking in space and features with phase synchrony

Sabine Muzellec1,2, Drew Linsley1,3, Alekh K. Ashok1,3, Rufin VanRullen2, Thomas Serre1,3; 1Carney Institute for Brain Science, Brown University, 2CerCo - CNRS, 3Department of Cognitive Linguistic & Psychological Sciences, Brown University

Healthy humans depend on their ability to track objects as those objects move through the world, even as the objects change in appearance. Here, we introduce the FeatureTracker challenge to systematically evaluate and compare the abilities of humans and state-of-the-art deep neural networks (DNNs) to track objects that change in appearance over time. While humans solve this task effortlessly, DNNs cannot. Drawing inspiration from cognitive science and neuroscience, we describe a novel recurrent neural circuit that induces this tracking capability in DNNs by leveraging the oscillatory activity of its neurons to follow objects even as their appearances change. The resulting complex-valued recurrent neural network (CV-RNN) outperformed all other DNNs and approached human accuracy on the FeatureTracker challenge. The success of this novel neural circuit provides computational evidence for a long-hypothesized role of phase synchronization in visual attention and reasoning.
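The core idea behind a complex-valued recurrent network — units whose magnitude encodes activation strength while their phase can serve as a binding tag, so that units with aligned phases are read out as belonging to the same object — can be illustrated with a minimal toy sketch. This is an assumption-laden illustration, not the authors' CV-RNN architecture; all names, dimensions, and the specific nonlinearity are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, not from the paper).
n_units, n_steps = 8, 5

# Complex recurrent weights: magnitudes carry gain,
# phases carry relative timing offsets between units.
W = (rng.standard_normal((n_units, n_units))
     + 1j * rng.standard_normal((n_units, n_units))) * (0.3 / np.sqrt(n_units))

# Complex state z: |z| is the unit's activation, angle(z) is its phase.
z = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n_units))

for _ in range(n_steps):
    pre = W @ z
    # Bound the magnitude with tanh but preserve the phase, so that
    # phase alignment (synchrony) can emerge from the recurrent dynamics.
    z = np.tanh(np.abs(pre)) * np.exp(1j * np.angle(pre))

# Units whose phases cluster together would be grouped as one object.
phases = np.angle(z)
```

In such a scheme, tracking amounts to maintaining a coherent phase cluster on the target object as its feature content changes, which is the role the abstract attributes to oscillatory activity.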

Keywords: Neural circuits; Object tracking; Synchrony
