
Poster A104 in Poster Session A - Tuesday, August 6, 2024, 4:15 – 6:15 pm, Johnson Ice Rink

Data-driven deep neural network models of visual processing in Drosophila

Jonathan Skaza1, Erin Wong2, Arie Matsliah3,4, Benjamin Cowley1; 1Cold Spring Harbor Laboratory, 2Great Neck South High School, 3Princeton Neuroscience Institute, 4Princeton University

Visual projection neurons (VPNs) in Drosophila melanogaster's visual system integrate and project optic lobe information to the central brain. However, the specific visual features integrated by various VPNs are not well understood. Understanding this neural code is crucial for uncovering the inner workings of visuomotor transformations during behaviors like courtship and flight. We utilized VPN recordings from multiple studies to train deep neural network (DNN) models, including classic convolutional and connectome-inspired DNNs, to predict neural responses of VPNs within the optic glomeruli. Our models revealed the stimulus preferences and temporal properties for each optic glomerulus (OG). We found that despite large differences in architecture, the DNN models had similar accuracy in predicting OG responses. Thus, the artificial stimuli traditionally used to probe visual function—moving spots and bars—are too impoverished to distinguish competing models. We propose a new class of stimuli, optimized by our models, that maximize the differences in predicted responses between models. Presenting these "controversial" stimuli in future experiments will better refine our DNN models and unlock further insights into fruit fly visual processing.
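The "controversial" stimulus idea can be read as optimizing an input so that two candidate models' predicted responses diverge as much as possible. The following is a minimal, hypothetical sketch (not the authors' implementation): it assumes two PyTorch modules, model_a and model_b, that each map a stimulus tensor to predicted optic glomerulus responses; the stimulus shape, learning rate, step count, and pixel range are illustrative assumptions.

import torch

def controversial_stimulus(model_a, model_b, shape=(1, 1, 64, 64), steps=200, lr=0.05):
    """Gradient-ascend a stimulus so the two models' predicted responses diverge."""
    stim = torch.randn(shape, requires_grad=True)   # random initial stimulus
    opt = torch.optim.Adam([stim], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Disagreement objective: mean squared difference of predicted responses.
        disagreement = ((model_a(stim) - model_b(stim)) ** 2).mean()
        (-disagreement).backward()                  # ascend on disagreement
        opt.step()
        with torch.no_grad():
            stim.clamp_(0.0, 1.0)                   # keep pixels in a valid range
    return stim.detach()

Presenting such a stimulus to the fly and recording the actual OG response would then favor whichever model predicted that response more accurately.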

Keywords: visual system; optic lobe; lobula complex; deep learning
