Next-generation recurrent network models for cognitive neuroscience

Date Posted:  June 16, 2021
Date Recorded:  June 15, 2021
CBMM Speaker(s):  Guangyu Robert Yang
Description: 

Recurrent Neural Networks (RNNs) trained with machine learning techniques on cognitive tasks have become a widely accepted tool for neuroscientists. In comparison to traditional computational models in neuroscience, RNNs can offer substantial advantages in explaining complex behavior and neural activity patterns. They allow rapid generation of mechanistic hypotheses for cognitive computations. RNNs further provide a natural way to flexibly combine bottom-up biological knowledge with top-down computational goals into network models. However, early work with this approach faces fundamental challenges. In this talk, I will discuss some of these challenges, and several recent steps that we took to partly address them and to build next-generation RNN models for cognitive neuroscience.
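
To illustrate the general approach the abstract describes (this is not code from the talk), here is a minimal PyTorch sketch of training an RNN on a toy cognitive task: the network receives two noisy evidence channels and must report which one has the stronger signal. The task parameters, network sizes, and training settings below are illustrative assumptions.

```python
import torch
import torch.nn as nn

def make_batch(batch_size=64, seq_len=50, coherence=0.5):
    """Generate noisy evidence for two choices; the target is the stronger channel."""
    targets = torch.randint(0, 2, (batch_size,))                 # correct choice per trial
    means = nn.functional.one_hot(targets, 2).float() * coherence
    inputs = means.unsqueeze(0).expand(seq_len, -1, -1) \
             + 0.5 * torch.randn(seq_len, batch_size, 2)          # time-varying sensory noise
    return inputs, targets

class CognitiveRNN(nn.Module):
    """Vanilla RNN with a linear readout, decoded at the final time step."""
    def __init__(self, n_input=2, n_hidden=64, n_output=2):
        super().__init__()
        self.rnn = nn.RNN(n_input, n_hidden, nonlinearity='relu')
        self.readout = nn.Linear(n_hidden, n_output)

    def forward(self, x):
        rates, _ = self.rnn(x)             # rates: (seq_len, batch, n_hidden)
        return self.readout(rates[-1])     # decision read out at the last time step

model = CognitiveRNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    inputs, targets = make_batch()
    loss = loss_fn(model(inputs), targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, the hidden-state trajectories of model.rnn can be analyzed
# much like recorded neural activity, providing mechanistic hypotheses about
# how the task could be solved.
```
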