CBMM faculty offer graduate and undergraduate courses that integrate computational and empirical approaches to the study of intelligence. These courses introduce the mathematical frameworks used to formulate computational models, along with the experimental methods of neuroscience and cognitive science used to study the neural implementation of intelligent processes and the manifestation of these computations in human cognitive behavior. Examples of the integration of these perspectives are drawn from current research on intelligence. Materials for many of these courses are available online. Most graduate courses are open to advanced undergraduates with appropriate background. Enrollment is handled through the respective institutions.

Fall 2018

Massachusetts Institute of Technology (MIT)

Tutorial series in computational topics related to brain and cognitive sciences. Each tutorial consists of a short lecture followed by 'office hours' time to work through practice problems and to discuss problems participants want help with in their own research. Food will be provided.

Harvard University

Visual Object Recognition: Computational and Biological Mechanisms
Visual recognition is essential for most everyday tasks including navigation, reading and socialization, and is also important for engineering applications such as automatic analysis of clinical images, face recognition by computers, security tasks and automatic navigation. In spite of the enormous increase in computational power over the last decade, humans still outperform the most sophisticated engineering algorithms in visual recognition tasks. This course examines how circuits of neurons in visual cortex represent and transform visual information, covering the following topics: functional architecture of visual cortex, lesion studies, physiological experiments in humans and animals, visual consciousness, computational models of visual object recognition, computer vision algorithms.
This course introduces students to abstract models of what and how neurons compute and concrete analyses of real neurons in action. Topics include network models of sensory processing and memory, and techniques to compare these models with real experimental data. This course will emphasize students' contributions and classroom interactions. Programming projects will be a significant aspect of the course, so programming experience (Python, Matlab) is recommended. Familiarity, but not expertise, with linear algebra and differential equations will be assumed.
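The abstract neuron models such a course builds on can be sketched in a few lines of Python. The leaky integrate-and-fire model below is one standard starting point; the parameter values and function name are illustrative, not drawn from the course materials. The membrane voltage relaxes toward rest, is driven by an input current, and is reset whenever it crosses threshold:

```python
import numpy as np

def simulate_lif(I, dt=1e-4, tau=0.02, v_rest=-0.065, v_reset=-0.065,
                 v_thresh=-0.050, R=1e7):
    """Euler integration of a leaky integrate-and-fire neuron.

    I: input current (amps), one value per time step of length dt.
    Returns the voltage trace and the indices of spike times.
    """
    v = np.full(len(I), v_rest)
    spikes = []
    for t in range(1, len(I)):
        # Leak toward rest plus input drive, scaled by the membrane time constant.
        dv = (-(v[t - 1] - v_rest) + R * I[t - 1]) * dt / tau
        v[t] = v[t - 1] + dv
        if v[t] >= v_thresh:      # threshold crossing: record a spike and reset
            spikes.append(t)
            v[t] = v_reset
    return v, spikes

# A constant suprathreshold current produces regular spiking.
current = np.full(5000, 2e-9)   # 2 nA for 0.5 s at dt = 0.1 ms
v, spikes = simulate_lif(current)
```

Comparing such a model against recorded spike trains (for example, via firing rates or interspike-interval statistics) is the kind of model-data comparison the course description refers to.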

Spring 2018

Massachusetts Institute of Technology (MIT)

The Human Intelligence Enterprise
Analyzes seminal work directed at the development of a computational understanding of human intelligence, such as work on learning, language, vision, event representation, commonsense reasoning, self-reflection, story understanding, and analogy. Reviews visionary ideas of Turing, Minsky, and other influential thinkers. Examines the implications of work on brain scanning, developmental psychology, and cognitive psychology. Emphasis on discussion and analysis of original papers. Students taking the graduate version complete additional assignments.
Neurotechnology in Action
Dr. Maxine Jonas, Prof. Alan Jasanoff
Offers a fast-paced introduction to numerous laboratory methods at the forefront of modern neurobiology. Comprises a sequence of modules focusing on neurotechnologies that are developed and used by MIT research groups. Each module consists of a background lecture and 1-2 days of firsthand laboratory experience. Topics typically include optical imaging, optogenetics, high throughput neurobiology, MRI/fMRI, advanced electrophysiology, viral and genetic tools, and connectomics.

IAP 2018

Massachusetts Institute of Technology (MIT)

Memory Wars
Lindsey Williams
Research in science is driven by frameworks and hypotheses that determine the design and interpretation of experiments and how the field evolves. A critical discussion of these hypotheses can raise awareness of the current state of the field, build familiarity with terminology and concepts, sharpen critical-thinking skills, and develop the intuition needed to design effective experiments that tackle key open questions.

Fall 2017

University of Central Florida

Lecture and workshop series on introductory topics in artificial intelligence. Each unit in the series consists of lectures on a topic followed by workshops focused on building the systems covered in the lecture(s). Topics include neural networks, reinforcement learning, [neuro]evolutionary computation, and building machines that learn and think like people.

Spring 2017

Massachusetts Institute of Technology (MIT)

Cognitive Neuroscience
Earl Miller
Explores the cognitive and neural processes that support attention, vision, language, motor control, navigation, and memory. Introduces basic neuroanatomy, functional imaging techniques, and behavioral measures of cognition. Discusses methods by which inferences about the brain bases of cognition are made. Considers evidence from human and animal models. Students prepare presentations summarizing journal articles.

Harvard University

This course examines recent work applying computational models to mental disorders. These models formalize psychopathology in terms of breakdowns in fundamental neurocognitive processes, linking normal and abnormal brain function within a common framework. Computational modeling has already begun to yield insights, and even possible treatments, for a wide range of disorders, including schizophrenia, autism, Parkinson's disease, depression, obsessive-compulsive disorder, and attention-deficit hyperactivity disorder. The course consists of weekly readings from the primary literature, with one student leading the discussion of each paper.
Computational Neuroscience
Follows trends in modern brain theory, focusing on local neuronal circuits and deep architectures. Explores the relation between network structure, dynamics, and function. Introduces tools from information theory, dynamical systems, statistics, and learning theory in the study of experience-dependent neural computation. Specific topics include: computational principles of early sensory systems; unsupervised, supervised and reinforcement learning; attractor computation and memory in recurrent cortical circuits; noise, chaos, and coding in neuronal systems; learning and computation in deep networks in the brain and in AI systems. Cross-listed in Physics and SEAS.
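The attractor computation and memory mentioned above are often introduced through the Hopfield network, in which stored patterns become fixed points of the network dynamics. The sketch below is illustrative only (the network size, pattern count, and function names are assumptions, not course material): it stores random ±1 patterns with a Hebbian outer-product rule and recovers one of them from a corrupted cue.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hopfield(patterns):
    """Hebbian outer-product learning rule; self-connections zeroed."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=20):
    """Synchronous sign updates until a fixed point (or the step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1          # break ties toward +1
        if np.array_equal(new, state):
            break                  # reached an attractor
        state = new
    return state

# Store three random +/-1 patterns in a 100-unit network,
# then recover the first one from a corrupted cue.
patterns = rng.choice([-1, 1], size=(3, 100))
W = train_hopfield(patterns)
cue = patterns[0].copy()
flip = rng.choice(100, size=10, replace=False)
cue[flip] *= -1                    # corrupt 10% of the bits
recovered = recall(W, cue)
```

At low memory load (far fewer patterns than units), the corrupted cue falls within the basin of attraction of the stored pattern, which is the basic intuition behind attractor models of recurrent cortical memory.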