Poster B105 in Poster Session B - Thursday, August 8, 2024, 1:30 – 3:30 pm, Johnson Ice Rink

Learning Dynamics of Linear Recurrent Neural Networks

Alexandra M. Proca1, Murray Shanahan1, Pedro A.M. Mediano1; 1Imperial College London

Recurrent neural networks (RNNs) are widely used in neuroscience to model neural dynamics and to learn tasks with temporal dependencies, and have been shown to utilize complex dynamical structures. However, it is still unknown how such structures emerge during training. Here, we aim to develop a better theoretical understanding of learning dynamics in RNNs by studying their linear counterparts analytically. Despite the absence of nonlinearity, deep linear networks are known to exhibit nonlinear learning dynamics. We show that exploding gradients act as an effective regularizer of both the recurrent and input-output weights, and we derive exact solutions for the nonlinear learning dynamics of the input-output connectivity modes, which we verify in simulation. Finally, we study the loss landscape and gradients for data with different temporal structures, revealing which data dynamics are (un)learnable and their corresponding solutions, criteria for generalization across trajectory lengths, and the existence of a bifurcation that drives parameters towards either the global minimum or suboptimal solutions. Our work provides a first analytical treatment of the relationship between temporally evolving data and learning dynamics in linear RNNs, and builds a basis for better understanding how complex dynamic behavior emerges in cognitive models.

Keywords: learning dynamics; RNNs; linear networks
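
As an illustration of the setting the abstract describes, below is a minimal sketch of a linear RNN, h_t = W h_{t-1} + U x_t with outputs y_t = V h_t, trained by gradient descent via backpropagation through time on a toy temporal task. This is not the authors' code or derivation; the network sizes, the delayed-copy task, the small random initialization, the learning rate, and the number of training steps are all illustrative assumptions. Plotting the recorded losses gives the kind of numerical learning trajectory against which analytical predictions of learning dynamics can be compared.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out, T = 2, 8, 2, 10           # assumed sizes and trajectory length
X = rng.normal(size=(T, n_in))                # assumed input trajectory
Y = np.vstack([np.zeros((1, n_in)), X[:-1]])  # assumed toy target: one-step delayed copy of the input

U = rng.normal(scale=0.1, size=(n_hid, n_in))   # input weights
W = rng.normal(scale=0.1, size=(n_hid, n_hid))  # recurrent weights
V = rng.normal(scale=0.1, size=(n_out, n_hid))  # output weights
lr = 0.05                                       # assumed learning rate

def forward(U, W, V):
    """Run the linear RNN h_t = W h_{t-1} + U x_t, y_t = V h_t from h_0 = 0."""
    h = np.zeros(n_hid)
    hs, ys = [], []
    for t in range(T):
        h = W @ h + U @ X[t]
        hs.append(h.copy())
        ys.append(V @ h)
    return np.array(hs), np.array(ys)

losses = []
for step in range(3000):
    hs, ys = forward(U, W, V)
    err = (ys - Y) / T                      # d/dy_t of the loss 0.5/T * sum_t ||y_t - Y_t||^2
    losses.append(0.5 * np.sum((ys - Y) ** 2) / T)
    # Backpropagation through time for the linear network.
    dV = err.T @ hs
    dU, dW = np.zeros_like(U), np.zeros_like(W)
    dh = np.zeros(n_hid)
    for t in reversed(range(T)):
        dh = V.T @ err[t] + W.T @ dh        # gradient w.r.t. h_t (direct + through h_{t+1})
        h_prev = hs[t - 1] if t > 0 else np.zeros(n_hid)
        dU += np.outer(dh, X[t])
        dW += np.outer(dh, h_prev)
    U, W, V = U - lr * dU, W - lr * dW, V - lr * dV

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```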