
Poster B102 in Poster Session B - Thursday, August 8, 2024, 1:30 – 3:30 pm, Johnson Ice Rink

Impact of dendritic non-linearities on the computational capabilities of neurons

Clarissa Lauditi1, Carlo Baldassi2, Nicolas Brunel2, Enrico M. Malatesta2, Fabrizio Pittorino3, Riccardo Zecchina; 1Harvard University, 2Bocconi University, 3Politecnico di Milano

Recent experiments in neurophysiology, primarily in pyramidal cells, have shown that dendrites contribute to neuronal computation through non-linear integration of synaptic inputs. In this work we model a single neuron as a two-layer network with non-overlapping synaptic weights and a biologically plausible form of dendritic non-linearity, a model that is analytically tractable with statistical physics methods. Analytical and numerical analysis of the model reveals key computational benefits of non-linear dendritic integration over traditional linear neuron models. We find that the dendritic non-linearity concurrently enhances both the number of learnable input-output associations and the learning speed. In contrast to previously studied linear neuron models, we find that the experimentally observed synaptic weight sparsity emerges naturally as a consequence of non-linear dendritic integration, while the experimentally measured synaptic weight distribution is consistently reproduced. This non-linearly induced sparsity brings a second advantage for information processing: robustness to input and synaptic noise. By testing our model on standard real-world benchmark datasets inspired by deep learning practice, we empirically observe that the non-linearity improves generalization performance, a desirable property of neurons for non-trivial information processing.
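A minimal sketch of the architecture described above may help fix ideas: a two-layer "tree" network in which N synaptic inputs are partitioned into K non-overlapping dendritic branches, each branch applies a non-linearity to its local weighted sum, and the soma thresholds the sum of branch outputs. The choice of ReLU for the dendritic transfer function g, the Gaussian weights, and the sizes N and K are placeholder assumptions for illustration, not the specific biologically plausible non-linearity analyzed in the work.

```python
import numpy as np

def dendritic_neuron(x, w, g=lambda h: np.maximum(h, 0.0)):
    """Two-layer neuron with non-overlapping dendritic branches.

    x: input vector of length N
    w: (K, N // K) weight matrix, one row per branch
    g: dendritic non-linearity (ReLU here as a placeholder assumption)
    """
    K, branch_size = w.shape
    branches = x.reshape(K, branch_size)                    # non-overlapping partition of inputs
    dendritic_out = g(np.einsum('kn,kn->k', w, branches))   # per-branch non-linear activation
    return np.sign(dendritic_out.sum())                     # somatic threshold output

# Example usage: N = 1000 synapses split across K = 10 branches (illustrative sizes).
rng = np.random.default_rng(0)
N, K = 1000, 10
w = rng.normal(size=(K, N // K)) / np.sqrt(N // K)
x = rng.normal(size=N)
print(dendritic_neuron(x, w))  # binary input-output association: +1 or -1
```

In the linear special case (g the identity), the model reduces to a single perceptron over all N weights; the computational benefits discussed above arise precisely from the non-linearity applied at the branch level before the somatic sum.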

Keywords: Single neuron computation; Learning; Synaptic plasticity; Dendritic non-linearity
