
Poster A22 in Poster Session A - Tuesday, August 6, 2024, 4:15 – 6:15 pm, Johnson Ice Rink

Latent cause inference as an efficient and flexible learning rule for cognitive graphs

Jungsun Yoo1, Dale Zhou1, Aaron M. Bornstein1; 1University of California, Irvine

In complex and continuous environments, agents acquire knowledge from experience by extracting structural regularities, whether spatial or temporal, from sequential observations. Cognitive graphs based on hidden Markov models (HMMs) offer an efficient framework for elucidating how agents learn latent environment structure. However, a leading algorithm implementing this approach ("Clone-Structured Cognitive Graphs"; CSCG) assumes a fixed allocation of neural resources to this problem, which may undermine biological plausibility and prove inefficient and inflexible when learning environments of unknown complexity. Here, we replace the fixed allocation of neural resources with a rational procedure that adapts the complexity of the internal representation according to nonparametric inference of latent structure. We demonstrate that, on the same benchmarks used to validate the original algorithm, our modification enhances efficiency without sacrificing performance. Our result suggests that this adaptive construction of cognitive graphs could benefit learning in environments with unknown state-space complexity, and may thus better explain behavior in resource-constrained biological organisms.
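The abstract does not spell out the nonparametric mechanism, but latent cause inference models commonly place a Chinese Restaurant Process (CRP) prior over latent states, opening a new state only when existing ones explain the data poorly. As a minimal, illustrative sketch (the function names and the specific prior are assumptions, not the authors' published implementation), the allocation step might look like:

```python
import random

def crp_probabilities(counts, alpha):
    """CRP prior: probability of assigning the next observation to each
    existing latent cause (proportional to its usage count) or to a
    brand-new cause (proportional to the concentration parameter alpha)."""
    total = sum(counts) + alpha
    return [c / total for c in counts] + [alpha / total]

def sample_latent_cause(counts, alpha, rng=random):
    """Sample a cause index; returning len(counts) means 'open a new
    latent cause', i.e., grow the state space only as the data demand."""
    probs = crp_probabilities(counts, alpha)
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(counts)  # guard against floating-point underflow
```

Under this kind of prior the expected number of latent states grows only logarithmically with the number of observations, which is one way an agent could adapt representational complexity instead of fixing the clone count in advance.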

Keywords: clone-structured cognitive graphs; latent cause inference; hidden Markov models
