
Poster B18 in Poster Session B - Thursday, August 8, 2024, 1:30 – 3:30 pm, Johnson Ice Rink

Predicting human behavioral decisions with recurrent neural networks

Yu-Ang Cheng1, Ivan Felipe Rodriguez1, Thomas Serre1; 1Department of Cognitive, Linguistic & Psychological Sciences, Brown University

Current neural network modeling work in visual recognition has focused primarily on matching behavioral choices and related accuracy measures. Visual perception is a dynamic process that unfolds in time, but moving beyond characterizing choice patterns to capturing temporal aspects of visual decision-making has been challenging. We introduce a novel computational framework to optimize the response times of recurrent neural networks (RNNs). First, we consider a random dot motion task and show how an RNN can be fitted to human psychophysics data. Second, we train an ideal observer RNN model to optimize a tradeoff between speed and accuracy. Our results indicate that human-like reaction time distributions can naturally emerge in a neural network explicitly optimized to solve a task in minimal computing time. Finally, we use our approach with a biologically plausible circuit model of decision-making known as the Wong-Wang model. We show that it is possible to stack this module on top of a task-optimized convolutional neural network to fit human behavioral data. Overall, our results suggest that the proposed framework can be effectively used to fit models of visual perception to the full set of human behavioral data, bringing us one step closer to an integrated model of human visual perception.
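To make the speed-accuracy idea concrete, the sketch below (not the authors' code; all module and parameter names are illustrative) shows one way such a training objective could look in PyTorch: an RNN produces a per-timestep decision readout, and a cross-entropy term is combined with a penalty that grows with elapsed time, so the network is rewarded for committing to a correct choice early.

# Illustrative sketch only: an RNN with a per-timestep readout trained under a
# hypothetical speed-accuracy tradeoff loss (accuracy term + time penalty).
import torch
import torch.nn as nn

class DecisionRNN(nn.Module):
    def __init__(self, input_size=2, hidden_size=64):
        super().__init__()
        self.rnn = nn.GRU(input_size, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, 2)  # two choice alternatives

    def forward(self, x):
        h, _ = self.rnn(x)       # (batch, time, hidden)
        return self.readout(h)   # per-timestep decision evidence

def speed_accuracy_loss(logits, targets, time_cost=0.01):
    # Cross-entropy at every timestep plus a penalty that increases with time,
    # encouraging the network to reach the correct decision as early as possible.
    batch, n_steps, _ = logits.shape
    ce = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.repeat_interleave(n_steps),
        reduction="none",
    ).view(batch, n_steps)
    time_penalty = time_cost * torch.arange(n_steps, dtype=logits.dtype)
    return (ce + time_penalty).mean()

# Toy usage with random stand-in "motion evidence": (batch, time, feature) inputs.
model = DecisionRNN()
x = torch.randn(8, 50, 2)
y = torch.randint(0, 2, (8,))
loss = speed_accuracy_loss(model(x), y)
loss.backward()

A reaction time for the model could then be read out as the first timestep at which the decision evidence crosses a threshold, which is what allows reaction-time distributions, not just choices, to be compared against human data.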

Keywords: recurrent neural networks; perceptual decision-making; human-AI alignment; reaction times
