
Poster A63 in Poster Session A - Tuesday, August 6, 2024, 4:15 – 6:15 pm, Johnson Ice Rink

Optimization of fully differentiable ODE neurons using gradient descent

Ilenna Jones1,2, Konrad Kording2; 1Harvard University, 2University of Pennsylvania

Neuroscientists fit simulations of single neurons to data. Fitting morphologically and biophysically detailed neuron models is computationally expensive because typical gradient-free approaches, such as evolutionary algorithms, converge slowly for neurons with many parameters. Here we introduce a gradient-based algorithm using differentiable ODE solvers, a class of models that scales well to high-dimensional problems. We employ GPUs to run many morphologically detailed neuron simulations efficiently in parallel and thus fit heterogeneously distributed ion channel densities. As a proof of concept, we use this efficient optimization algorithm to fit models analogous to specific experimental conditions in under 4 hours on a single GPU. We find that individually stimulating all dendritic compartments of the model produces outputs that make the fitted models identifiable. Optimization converges reliably even with limited numbers of recording sites; limiting the number of stimulation sites, however, reduces its reliability. Our approach makes model fitting efficient, with the potential to support models with many parameters. Differentiable neuron models promise a new era of optimizable neuron models with many free parameters, a key feature of real neurons.
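To make the idea concrete, here is a minimal illustrative sketch (not the authors' code) of gradient-based fitting through a differentiable ODE solver: a single-compartment passive membrane with one unknown leak conductance is integrated with forward Euler, the sensitivity dV/dg is integrated alongside it (a forward-sensitivity stand-in for automatic differentiation), and gradient descent recovers the conductance from a synthetic voltage trace. All variable names and parameter values are hypothetical.

```python
# Hedged sketch: fit one leak conductance g of a passive-membrane ODE neuron
# by gradient descent, differentiating through the ODE solver itself.
C, E_leak, dt, T = 1.0, -70.0, 0.1, 200   # capacitance, leak reversal (mV), Euler step (ms), steps
I_inj = 5.0                               # constant injected current (illustrative units)

def simulate(g):
    """Forward-Euler integration of dV/dt = (-g*(V - E_leak) + I_inj)/C,
    together with the sensitivity S = dV/dg, which obeys
    dS/dt = (-(V - E_leak) - g*S)/C (forward sensitivity equation)."""
    V, S, Vs, Ss = E_leak, 0.0, [], []
    for _ in range(T):
        dV = (-g * (V - E_leak) + I_inj) / C
        dS = (-(V - E_leak) - g * S) / C
        V += dt * dV
        S += dt * dS
        Vs.append(V)
        Ss.append(S)
    return Vs, Ss

g_true = 0.3
target, _ = simulate(g_true)              # synthetic "recorded" voltage trace

g, lr = 1.0, 1e-4                         # initial guess, learning rate
for _ in range(2000):
    Vs, Ss = simulate(g)
    # Gradient of the mean squared error over the trace with respect to g
    grad = sum(2 * (v - t) * s for v, t, s in zip(Vs, target, Ss)) / T
    g -= lr * grad                        # gradient-descent step on the conductance

print(abs(g - g_true) < 1e-3)             # → True
```

In the paper's setting, automatic differentiation through the solver plays the role of the hand-derived sensitivity here, which is what lets the method scale to many heterogeneously distributed channel densities at once, with many simulations batched on a GPU.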

Keywords: conductance model; neuron modelling; optimization; gradient descent
