Normalization models of attention

Date Posted:  November 19, 2024
Date Recorded:  November 13, 2024
Speaker(s):  Rachel Denison, Boston University
Description: 

Attention is a cognitive process that allows us to prioritize the sensory information that is most relevant for our behavioral goals. In one successful class of computational models, attention biases neural responses through its interaction with normalization—a canonical neural computation that promotes efficient representations across sensory and cognitive systems. Normalization models of attention have provided quantitative explanations for a wide range of findings on how attention affects neural activity and perception, making them a powerful example of a computational framework for top-down modulation. In this tutorial, we will learn about normalization models of attention—including some of our recent work on dynamic attention—with hands-on MATLAB exercises.
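
As a flavor of the kind of exercise the tutorial covers, here is a minimal one-dimensional sketch of the Reynolds and Heeger (2009) normalization model of attention: the stimulus drive is multiplied by an attention field, pooled over space to form the suppressive drive, and divided through to give the population response. The stimulus positions, attention-field width, and parameter values below are illustrative choices, not taken from the tutorial materials.

% Minimal 1D sketch of the normalization model of attention
% (Reynolds & Heeger, 2009). Parameter values are illustrative only.
x = linspace(-10, 10, 201);                       % spatial positions (deg)

% Stimulus drive: two stimuli, centered at -4 and +4 deg
E = exp(-(x + 4).^2 / 2) + exp(-(x - 4).^2 / 2);

% Attention field: baseline gain of 1 plus a bump at the attended location (-4 deg)
A = 1 + 2 * exp(-(x + 4).^2 / (2 * 1.5^2));

% Suppressive drive: attention-weighted stimulus drive pooled over space
poolKernel = exp(-x.^2 / (2 * 3^2));
poolKernel = poolKernel / sum(poolKernel);
S = conv(A .* E, poolKernel, 'same');

% Divisive normalization: excitatory drive over suppressive drive plus a constant
sigma = 0.1;
R = (A .* E) ./ (sigma + S);

figure; plot(x, E, x, R);
legend('stimulus drive E', 'response R (attend left)');
xlabel('position (deg)');

Running the sketch shows the characteristic result: the response to the attended stimulus is boosted relative to the unattended one, with the size of the effect depending on how attentional gain interacts with the normalization pool.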

Links to code: