Learn about computational models of attention through a comprehensive tutorial exploring how attention interacts with normalization in neural systems. Dive into the mechanics of how attention biases neural responses through normalization, a fundamental neural computation that enables efficient sensory and cognitive processing. Work through hands-on MATLAB exercises to understand quantitative explanations of attention's effects on neural activity and perception, including recent research on dynamic attention. Access the accompanying code through the provided GitHub repository (a normalization model GUI) and OSF repository (dynamic attention modeling). Led by Rachel Denison of Boston University, this 71-minute session provides a powerful computational framework for understanding top-down attentional modulation in the brain.
Normalization Models of Attention - From Neural Computation to Behavioral Goals
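To give a flavor of the kind of model the tutorial covers, here is a minimal sketch of attention interacting with divisive normalization, in the spirit of the Reynolds & Heeger (2009) normalization model of attention: an attention field multiplies the stimulus drive, and the result is divided by a pooled suppressive drive plus a semisaturation constant. The tutorial's own exercises are in MATLAB; this Python version, with illustrative parameter values and a simple 1D spatial layout, is an assumption-laden sketch rather than the tutorial's actual code.

```python
import numpy as np

def normalization_model(stimulus, attention, sigma=1.0, pool_width=5.0):
    """Attention-modulated responses via divisive normalization (1D sketch).

    stimulus   : stimulus drive at each spatial position (1D array)
    attention  : attentional gain at each position (1.0 = neutral)
    sigma      : semisaturation constant (controls contrast gain)
    pool_width : std of the Gaussian suppressive pool, in array units
    """
    x = np.arange(len(stimulus))
    # Excitatory drive: stimulus drive scaled by the attention field
    excitatory = attention * stimulus
    # Suppressive drive: excitatory drive pooled over a Gaussian neighborhood
    kernel = np.exp(-0.5 * ((x - x.mean()) / pool_width) ** 2)
    kernel /= kernel.sum()
    suppressive = np.convolve(excitatory, kernel, mode="same")
    # Divisive normalization: attended inputs gain relative to unattended ones
    return excitatory / (suppressive + sigma)

# Two identical stimuli at positions 20 and 60; attend near position 20
stimulus = np.zeros(80)
stimulus[20] = stimulus[60] = 10.0
attention = np.ones(80)
attention[18:23] = 4.0  # illustrative attentional gain at the attended site

response = normalization_model(stimulus, attention)
print(response[20] > response[60])  # attended stimulus drives a larger response
```

Because the attentional gain enters both the numerator (excitatory drive) and, through pooling, the denominator (suppressive drive), the same mechanism can produce either response gain or contrast gain depending on the relative sizes of the stimulus and the attention field, which is a central point of the normalization framework the session develops.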