CBMM faculty offer graduate and undergraduate courses that integrate computational and empirical approaches used in the study of problems related to intelligence. These courses introduce some of the mathematical frameworks used to formulate computational models, as well as experimental methods used in neuroscience and cognitive science to study the neural implementations of intelligent processes and the manifestation of these computations in human cognitive behavior. Examples of the integration of these perspectives are drawn from current research on intelligence. Materials for many of these courses are available online. Most graduate courses are open to advanced undergraduates with appropriate background. Enrollment for courses is handled through the respective institutions.

Spring 2019

Massachusetts Institute of Technology (MIT)

The Human Brain
Surveys the core perceptual and cognitive abilities of the human mind and asks how these are implemented in the brain. Key themes include the functional organization of the cortex, as well as the representations and computations, developmental origins, and degree of functional specificity of particular cortical regions. Emphasizes the methods available in human cognitive neuroscience, and what inferences can and cannot be drawn from each.
Cognitive Science
Edward Gibson, Pawan Sinha
This class is the second half of an intensive survey of cognitive science for first-year graduate students. Topics include visual perception, language, memory, cognitive architecture, learning, reasoning, decision-making, and cognitive development. Topics are covered from behavioral, computational, and neural perspectives.
The course covers foundations and recent advances in the study of perception: the process of deriving information about the world from our sensory receptors.
Neural Coding and Perception of Sound
Daniel Polley, Bertrand Delgutte, M. C. Brown
(Same subject as HST.723[J]) Neural structures and mechanisms mediating the detection, localization and recognition of sounds. General principles are conveyed by theme discussions of auditory masking, sound localization, musical pitch, cochlear implants, cortical plasticity and auditory scene analysis. Follows Harvard FAS calendar.

Harvard University

This course provides a foundational overview of the fundamental ideas in computational neuroscience and the study of biological intelligence. At the same time, the course will connect the study of brains to the rapid development of ideas in artificial intelligence. Topics include the biophysics of computation, neural networks, machine learning, Bayesian models, theory of learning, deep convolutional networks, generative adversarial networks, neural coding, control and dynamics of neural activity, applications to brain-machine interfaces, and connectomics, among others. Lectures will be taught by leading Harvard experts in the field.
Computational Cognitive Neuroscience
"What I cannot create, I do not understand." – Richard Feynman
This course applies Richard Feynman's dictum to the brain, by teaching students how to simulate brain function with computer programs. Special emphasis will be placed on how neurobiological mechanisms give rise to cognitive processes like learning, memory, decision-making, and object perception. Students will learn how to understand experimental data through the lens of computational models, and ultimately how to build their own models.
Follows trends in modern brain theory, focusing on local neuronal circuits and deep architectures. Explores the relation between network structure, dynamics, and function. Introduces tools from information theory, dynamical systems, statistics, and learning theory in the study of experience-dependent neural computation. Specific topics include: computational principles of early sensory systems; unsupervised, supervised and reinforcement learning; attractor computation and memory in recurrent cortical circuits; noise, chaos, and coding in neuronal systems; learning and computation in deep networks in the brain and in AI systems.

Johns Hopkins University

Vision as Bayesian Inference
This is an advanced course on computer vision from a probabilistic and machine learning perspective. It covers techniques such as linear and non-linear filtering, geometry, energy function methods, Markov random fields, conditional random fields, graphical models, probabilistic grammars, and deep neural networks. These are illustrated on a set of vision problems including image segmentation, semantic segmentation, depth estimation, object recognition, object parsing, scene parsing, action recognition, and text captioning.
IAP 2019

Massachusetts Institute of Technology (MIT)

Course will be held the third week of IAP (week of January 28th), M-F, from 2-5pm, in MIT room 46-3015. Provides instruction and dialogue on practical ethical issues relating to the responsible conduct of human and animal research in the brain and cognitive sciences, with specific emphasis on topics relevant to young researchers, including data handling, animal and human subjects, misconduct, mentoring, intellectual property, and publication. Preliminary assigned readings and an initial faculty lecture are followed by discussion groups of four to five students each. A short written summary of the discussions is submitted at the end of each class. See the IAP Guide for registration information.

Evolution, Computation, and Learning
Daniel Czegel
Here, we will explore recent work in evolutionary computation and theoretical biology modeling the processes of evolution. Specifically, we will focus on these broad questions: What processes govern evolution, or 'evolutionary learning'? How can these processes improve upon or inspire new models or theories of learning, search, and development? What role, if any, do evolutionary computation and theoretical biology play in investigating human cognition or developing AI? Are there frameworks, theories, or models that we can import from these fields?