
Poster A120 in Poster Session A - Tuesday, August 6, 2024, 4:15 – 6:15 pm, Johnson Ice Rink

Probing Human Vision via an Image-to-EEG Encoding Model

Zitong Lu1, Julie Golomb1; 1The Ohio State University

Understanding the complex interplay between visual stimuli and brain activity has been a focal point in cognitive neuroscience. The recent advent of artificial intelligence (AI) provides novel tools for experimental and computational neuroscience research. In this study, we developed a pioneering encoding framework, called “Img2EEG”, as an innovative tool for investigating visual mechanisms. Trained on a large-scale EEG dataset of natural images at the individual-subject level, Img2EEG effectively learns individualized, brain-optimized features and generates highly realistic EEG signals given any image input. Using Img2EEG, we can track the temporal dynamics underlying visual processing and uncover possible mechanisms of individual differences in visual perception. Moreover, when Img2EEG is fed novel sets of images distinct from its original training dataset, the artificially generated EEG signals reproduce the classic face-specific ‘N170’ ERP and object-feature multivariate pattern analysis results. Furthermore, our Img2EEG encoding model can also perform zero-shot EEG-to-image retrieval, outperforming current state-of-the-art EEG decoding models. Overall, by mapping visual inputs to high-temporal-resolution brain signals, Img2EEG offers a novel and powerful approach to probing human visual representations.
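The abstract does not detail Img2EEG's architecture, but a minimal sketch of the general image-to-EEG encoding idea it describes (a pretrained vision backbone whose features are mapped, per subject, to simulated EEG epochs) might look like the following. The class name ImageToEEGEncoder, the ResNet-18 backbone, and the channel/time dimensions are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an image-to-EEG encoding model in PyTorch.
# The real Img2EEG architecture is not specified in the abstract; the
# backbone choice, layer sizes, and EEG shape (channels x time points)
# below are illustrative assumptions only.
import torch
import torch.nn as nn
from torchvision import models


class ImageToEEGEncoder(nn.Module):
    def __init__(self, n_channels=64, n_timepoints=100):
        super().__init__()
        # Frozen pretrained vision backbone supplies image features.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        for p in self.features.parameters():
            p.requires_grad = False
        # Subject-specific head maps image features to an EEG epoch
        # (channels x time points); one head would be trained per subject.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512, 1024),
            nn.ReLU(),
            nn.Linear(1024, n_channels * n_timepoints),
        )
        self.n_channels = n_channels
        self.n_timepoints = n_timepoints

    def forward(self, images):          # images: (B, 3, 224, 224)
        feats = self.features(images)   # (B, 512, 1, 1)
        eeg = self.head(feats)          # (B, channels * timepoints)
        return eeg.view(-1, self.n_channels, self.n_timepoints)


# Training on an image/EEG dataset could minimize, e.g., the MSE between
# predicted and recorded EEG epochs:
#   loss = nn.functional.mse_loss(model(images), real_eeg)
```

Under this kind of setup, the same trained model can be reused for zero-shot EEG-to-image retrieval by generating a predicted EEG epoch for each candidate image and ranking candidates by their similarity to a recorded EEG epoch; whether Img2EEG uses this particular retrieval scheme is not stated in the abstract.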

Keywords: Encoding Model, EEG, Visual Perception
