
Poster C97 in Poster Session C - Friday, August 9, 2024, 11:15 am – 1:15 pm, Johnson Ice Rink

Brain-inspired synaptic rule for adaptive continual learning in deep neural networks

Suhee Cho1, Hyeonsu Lee1, Se-Bum Paik1; 1Korea Advanced Institute of Science and Technology

While deep neural networks (DNNs) outperform humans on various tasks, they still struggle with continual learning due to catastrophic forgetting. To address this challenge, we propose a model that emulates the brain's ability to memorize sequential information. We specifically target the serial position effect and the Hebb repetition effect, which reflect working memory's capacity to retain sequential information. Inspired by synapses in the working memory system, we designed synapses with varying flexibilities, distributed them randomly within the network, and trained the network incrementally on new classes. Our model successfully replicates the serial position effect, effectively memorizing items learned both earlier and later in the sequence. It also reproduces the Hebb repetition effect, with memory strengthening under repeated learning. Consequently, our model adaptively allocates memory resources to sequentially presented information, suggesting a potential pathway toward adaptive continual learning in DNNs.
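The core mechanism described above can be sketched in a minimal toy form. This is an illustration, not the authors' implementation: we assume "flexibility" means a per-synapse learning-rate multiplier drawn at random, so some synapses update readily while others remain nearly fixed; the distribution and update rule here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# One weight matrix whose entries have heterogeneous "flexibility":
# per-synapse learning-rate multipliers drawn at random (assumed
# uniform here), so flexible synapses update quickly and rigid
# synapses barely change across sequential tasks.
n_in, n_out = 8, 4
W = rng.normal(0.0, 0.1, size=(n_in, n_out))
flexibility = rng.uniform(0.01, 1.0, size=W.shape)  # hypothetical distribution

def update(W, grad, base_lr=0.5):
    """Scale each weight's update by its own fixed flexibility."""
    return W - base_lr * flexibility * grad

# With a gradient of equal magnitude everywhere, the change in each
# weight is proportional to its flexibility: rigid synapses preserve
# earlier learning while flexible ones absorb new information.
grad = np.ones_like(W)
W_new = update(W, grad)
change = np.abs(W_new - W)
```

Under this toy rule, rigid (low-flexibility) synapses act as slowly decaying memory for earlier classes, while flexible synapses accommodate later ones; the paper's actual rule may differ.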

Keywords: continual learning; incremental learning; catastrophic forgetting; working memory
