CBMM faculty offer graduate and undergraduate courses that integrate computational and empirical approaches to the study of problems related to intelligence. These courses introduce some of the mathematical frameworks used to formulate computational models, as well as the experimental methods used in neuroscience and cognitive science to study the neural implementation of intelligent processes and the manifestation of these computations in human cognitive behavior. Examples of the integration of these perspectives are drawn from current research on intelligence. Materials for many of these courses are available online. Most graduate courses are open to advanced undergraduates with an appropriate background. Enrollment is handled through the respective institutions.

Spring 2017

Massachusetts Institute of Technology (MIT)

The Human Intelligence Enterprise
Analyzes seminal work directed at the development of a computational understanding of human intelligence, such as work on learning, language, vision, event representation, commonsense reasoning, self-reflection, story understanding, and analogy. Reviews the visionary ideas of Turing, Minsky, and other influential thinkers. Examines the implications of work on brain scanning, developmental psychology, and cognitive psychology. Emphasis on discussion and analysis of original papers. Students taking the graduate version complete additional assignments.
Cognitive Neuroscience
Earl Miller
Explores the cognitive and neural processes that support attention, vision, language, motor control, navigation, and memory. Introduces basic neuroanatomy, functional imaging techniques, and behavioral measures of cognition. Discusses methods by which inferences about the brain bases of cognition are made. Considers evidence from human and animal models. Students prepare presentations summarizing journal articles.
Cognitive Science
Edward Gibson, Pawan Sinha
This class is the second half of an intensive survey of cognitive science for first-year graduate students. Topics include visual perception, language, memory, cognitive architecture, learning, reasoning, decision-making, and cognitive development. Topics are covered from behavioral, computational, and neural perspectives.
Sensation and Perception
Studies how the senses work and how physical stimuli are transformed into signals in the nervous system. Examines how the brain uses those signals to make inferences about the world, and uses illusions and demonstrations to gain insight into those inferences. Emphasizes audition and vision, with some discussion of touch, taste, and smell. Provides experience with psychophysical methods.

Harvard University

Computational Neuroscience
Follows trends in modern brain theory, focusing on local neuronal circuits and deep architectures. Explores the relation between network structure, dynamics, and function. Introduces tools from information theory, dynamical systems, statistics, and learning theory in the study of experience-dependent neural computation. Specific topics include: computational principles of early sensory systems; unsupervised, supervised and reinforcement learning; attractor computation and memory in recurrent cortical circuits; noise, chaos, and coding in neuronal systems; learning and computation in deep networks in the brain and in AI systems. Cross-listed in Physics and SEAS.
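One of the topics listed above, attractor computation and memory in recurrent circuits, is commonly introduced through Hopfield-style networks. The sketch below is a generic illustration of that idea rather than material from the course: it stores a few random binary patterns with a Hebbian rule and recovers one of them from a corrupted cue.

    import numpy as np

    rng = np.random.default_rng(0)

    # Store a few random binary (+/-1) patterns with the Hebbian outer-product rule.
    n_units, n_patterns = 100, 3
    patterns = rng.choice([-1, 1], size=(n_patterns, n_units))
    W = (patterns.T @ patterns) / n_units
    np.fill_diagonal(W, 0)          # no self-connections

    # Start from a corrupted version of the first pattern and let the dynamics settle.
    state = patterns[0].copy()
    flip = rng.choice(n_units, size=20, replace=False)
    state[flip] *= -1               # corrupt 20% of the units

    for _ in range(10):             # asynchronous updates typically converge quickly
        for i in rng.permutation(n_units):
            state[i] = 1 if W[i] @ state >= 0 else -1

    overlap = (state @ patterns[0]) / n_units
    print(f"overlap with stored pattern after recall: {overlap:.2f}")   # usually close to 1.0
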
This course examines recent work applying computational models to mental disorders. These models formalize psychopathology in terms of breakdowns in fundamental neurocognitive processes, linking normal and abnormal brain function within a common framework. Computational modeling has already begun to yield insights, and even possible treatments, for a wide range of disorders, including schizophrenia, autism, Parkinson's disease, depression, obsessive-compulsive disorder, and attention-deficit hyperactivity disorder. The course consists of weekly readings from the primary literature, with one student leading the discussion of each paper.

IAP 2017

Massachusetts Institute of Technology (MIT)

Provides instruction and dialogue on practical ethical issues relating to the responsible conduct of human and animal research in the brain and cognitive sciences. Specific emphasis on topics relevant to young researchers, including data handling, animal and human subjects, misconduct, mentoring, intellectual property, and publication. Preliminary assigned readings and an initial faculty lecture are followed by discussion groups of four to five students each. A short written summary of the discussions is submitted at the end of each class. See the IAP Guide for registration information.

Fall 2016

Massachusetts Institute of Technology (MIT)

Aspects of a Computational Theory of Intelligence
Explores the problem of intelligence (its nature, how it is produced by the brain, and how it could be replicated in machines) with an approach that integrates computational modeling, neuroscience, and cognitive science. Focuses on four intellectual thrusts: how intelligence is grounded in computation, how these computations develop in childhood, how they are implemented in neural systems, and how social interaction enhances them. Research within these thrusts is unified by an overarching theme: how each contributes to a computational account of the way humans analyze dynamic visual imagery to understand objects and actions in the world.
Statistical Learning Theory and Applications
Provides students with the knowledge needed to use and develop advanced machine learning solutions to challenging problems. Covers foundations and recent advances of machine learning in the framework of statistical learning theory. Focuses on regularization techniques key to high-dimensional supervised learning. Starting from classical methods such as regularization networks and support vector machines, addresses state-of-the-art techniques based on principles such as geometry or sparsity, and discusses a variety of algorithms for supervised learning, feature selection, structured prediction, and multitask learning. Also focuses on unsupervised learning of data representations, with an emphasis on hierarchical (deep) architectures.
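The regularization techniques named here can be illustrated in a few lines. The following is a hedged sketch of a kernel ridge regression (a simple regularization network) fit to synthetic data; the RBF kernel, the width gamma, and the regularization weight lam are arbitrary illustrative choices, not values from the course.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D regression problem: noisy samples of a sine wave.
    X = rng.uniform(-3, 3, size=(40, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

    def rbf_kernel(A, B, gamma=0.5):
        """Gaussian (RBF) kernel matrix between the rows of A and B."""
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    # Kernel ridge regression: solve (K + lam * n * I) c = y, then f(x) = k(x, X) c.
    lam = 1e-2
    K = rbf_kernel(X, X)
    c = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)

    X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
    f_test = rbf_kernel(X_test, X) @ c
    print(np.round(f_test, 2))               # should roughly track sin(x) at the test points
    print(np.round(np.sin(X_test[:, 0]), 2))

Increasing lam smooths the fit (more regularization); decreasing it interpolates the noise, which is the bias-variance trade-off the course description alludes to.
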
Computational Cognitive Science
Introduction to computational theories of human cognition. Focuses on principles of inductive learning and inference, and the representation of knowledge. Computational frameworks include Bayesian and hierarchical Bayesian models, probabilistic graphical models, nonparametric statistical models and the Bayesian Occam's razor, sampling algorithms for approximate learning and inference, and probabilistic models defined over structured representations such as first-order logic, grammars, or relational schemas. Applications to understanding core aspects of cognition, such as concept learning and categorization, causal reasoning, theory formation, language acquisition, and social inference.
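The Bayesian ideas listed here, in particular the size principle behind the Bayesian Occam's razor in concept learning, can be shown with a toy example in the spirit of the classic "number game". The hypotheses and observations below are hypothetical choices for illustration, not course material.

    from fractions import Fraction

    # Candidate concepts over the numbers 1..100.
    hypotheses = {
        "even numbers":    {n for n in range(1, 101) if n % 2 == 0},
        "multiples of 10": {n for n in range(1, 101) if n % 10 == 0},
        "powers of two":   {2 ** k for k in range(1, 7)},          # 2..64
    }
    prior = {h: Fraction(1, len(hypotheses)) for h in hypotheses}

    def posterior(data):
        """Strong-sampling likelihood (1/|h|)^n, i.e. the size principle."""
        scores = {}
        for h, extension in hypotheses.items():
            if all(x in extension for x in data):
                scores[h] = prior[h] * Fraction(1, len(extension)) ** len(data)
            else:
                scores[h] = Fraction(0)
        total = sum(scores.values())
        return {h: s / total for h, s in scores.items()}

    # Seeing 16, 8, 2, 64 makes the smaller "powers of two" hypothesis dominate,
    # even though "even numbers" is also consistent with every observation.
    for h, p in posterior([16, 8, 2, 64]).items():
        print(f"{h:16s} {float(p):.3f}")
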