CBMM faculty offer graduate and undergraduate courses that integrate computational and empirical approaches to the study of problems related to intelligence. These courses introduce some of the mathematical frameworks used to formulate computational models, as well as the experimental methods that neuroscience and cognitive science use to study the neural implementation of intelligent processes and the manifestation of these computations in human cognitive behavior. Examples of the integration of these perspectives are drawn from current research on intelligence. Materials for many of these courses are available online. Most graduate courses are open to advanced undergraduates with appropriate background. Enrollment is handled through the respective institutions.

Fall 2021

Massachusetts Institute of Technology (MIT)

Statistical Learning Theory and Applications
Provides students with the knowledge needed to use and develop advanced machine learning solutions to challenging problems. Covers foundations and recent advances of machine learning in the framework of statistical learning theory. Focuses on regularization techniques key to high-dimensional supervised learning. Starting from classical methods such as regularization networks and support vector machines, addresses state-of-the-art techniques based on principles such as geometry or sparsity, and discusses a variety of algorithms for supervised learning, feature selection, structured prediction, and multitask learning. Also focuses on unsupervised learning of data representations, with an emphasis on hierarchical (deep) architectures.
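As a purely illustrative sketch of the kind of regularization technique this course builds on (not taken from the course materials; data, dimensions, and the regularization strength lam are invented), here is Tikhonov-regularized least squares (ridge regression) solved in closed form:

```python
# Minimal sketch of Tikhonov-regularized least squares (ridge regression).
# Illustrative only; all values below are arbitrary choices, not course code.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.5 * rng.normal(size=n)

lam = 1.0  # regularization strength (a free hyperparameter)
# Closed-form minimizer of  (1/n) * ||X w - y||^2 + lam * ||w||^2
w_hat = np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)
print("estimation error:", np.linalg.norm(w_hat - w_true))
```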
Provides instruction on the mechanistic basis of intelligence – how the brain produces intelligent behavior and how we may be able to replicate intelligence in machines. Examines how human intelligence emerges from computations in neural circuits, with the goal of reproducing similar intelligent behavior in machines. Working in teams, students complete computational projects and exercises that reinforce the theme of collaboration between (computer science + math) and (neuroscience + cognitive science). Culminates with student presentations of their projects. Instruction and practice in oral and written communication provided.
Computational Cognitive Science
Introduction to computational theories of human cognition. Focuses on principles of inductive learning and inference, and the representation of knowledge. Computational frameworks include Bayesian and hierarchical Bayesian models, probabilistic graphical models, nonparametric statistical models and the Bayesian Occam's razor, sampling algorithms for approximate learning and inference, and probabilistic models defined over structured representations such as first-order logic, grammars, or relational schemas. Applications to understanding core aspects of cognition, such as concept learning and categorization, causal reasoning, theory formation, language acquisition, and social inference.
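To give a flavor of the Bayesian ideas listed above, here is a toy concept-learning example in the spirit of the "Bayesian Occam's razor" (the hypotheses, prior, and data are invented for illustration and are not from the course): more specific hypotheses that still explain the data receive higher likelihood under the size principle.

```python
# Toy Bayesian concept learning with the size principle (illustrative only).
hypotheses = {
    "even numbers":    set(range(2, 101, 2)),
    "multiples of 10": set(range(10, 101, 10)),
    "powers of two":   {2, 4, 8, 16, 32, 64},
    "numbers 1-100":   set(range(1, 101)),
}
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}
data = [16, 8, 2, 64]  # observed positive examples of the hidden concept

def likelihood(h_set, data):
    # Size principle: each example is sampled uniformly from the hypothesis set.
    if not all(x in h_set for x in data):
        return 0.0
    return (1.0 / len(h_set)) ** len(data)

unnorm = {h: prior[h] * likelihood(s, data) for h, s in hypotheses.items()}
Z = sum(unnorm.values())
posterior = {h: p / Z for h, p in unnorm.items()}
for h, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{h:16s} {p:.3f}")
```

The smallest consistent hypothesis ("powers of two") dominates the posterior, even though broader hypotheses also contain the data.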
Introduction to cognitive development focusing on children's understanding of objects, agents, and causality. Develops a critical understanding of experimental design. Discusses how developmental research might address philosophical questions about the origins of knowledge, appearance and reality, and the problem of other minds. Provides instruction and practice in written communication as necessary to research in cognitive science (including critical reviews of journal papers, a literature review and an original research proposal), as well as instruction and practice in oral communication in the form of a poster presentation of a journal paper.

Harvard University

Visual recognition is essential for most everyday tasks including navigation, reading and socialization. Visual pattern recognition is also important for many engineering applications such as automatic analysis of clinical images, face recognition by computers, security tasks and automatic navigation. In spite of the enormous increase in computational power over the last decade, humans still outperform the most sophisticated engineering algorithms in visual recognition tasks. In this course, we will examine how circuits of neurons in visual cortex represent and transform visual information. The course will cover the following topics: functional architecture of visual cortex, lesion studies, physiological experiments in humans and animals, visual consciousness, computational models of visual object recognition, and computer vision algorithms.
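As a small illustration of the "computational models of visual object recognition" theme (not course material; it assumes SciPy is available and all parameter values are arbitrary), the sketch below applies an oriented Gabor filter, a simple model of a V1 simple cell, to a synthetic image containing a vertical edge:

```python
# Illustrative Gabor-filter sketch of a V1 simple cell (not from the course).
import numpy as np
from scipy.signal import convolve2d

def gabor(size=15, wavelength=6.0, sigma=3.0, theta=0.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

image = np.zeros((64, 64))
image[:, 32:] = 1.0                       # vertical luminance edge
response = convolve2d(image, gabor(theta=0.0), mode="same")
print("peak filter response near the edge:", float(np.abs(response).max()))
```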
This course provides a tour of foundational topics in learning from a theoretical perspective. It covers a diversity of learning processes, aiming for breadth over depth (although it inevitably neglects several important forms of learning). Each meeting will consist of student-led presentations of two papers. Experience with computational modeling is not required, but students should have some familiarity with basic math (algebra and probability).
Despite recent advances in computer science and machine learning, human infants remain the most prodigious learners on the planet. This seminar considers the origins and nature of human cognitive development in four broad domains: knowledge of objects and their physical relationships, knowledge of people and social relationships, knowledge of geometry and the larger spatial layout, and knowledge of numbers and mathematics. We will discuss how these foundational cognitive building blocks support humans' ability to explain, understand, and generalize, skills that are critical for successfully navigating our surroundings. Understanding these core psychological competencies has become essential to progress in many areas of society, including efforts to improve education, to create digital "cognitive assistants" who help us navigate, plan, and remember things, and to develop human-like artificial intelligence. Building on findings from basic research, we will consider how each of these efforts can be advanced.

Spring 2021

Massachusetts Institute of Technology (MIT)

The Human Brain
Surveys the core perceptual and cognitive abilities of the human mind and asks how these are implemented in the brain. Key themes include the functional organization of the cortex, as well as the representations and computations, developmental origins, and degree of functional specificity of particular cortical regions. Emphasizes the methods available in human cognitive neuroscience, and what inferences can and cannot be drawn from each.
Cognitive Science
Edward Gibson, Pawan Sinha
Intensive survey of cognitive science. Topics include visual perception, language, memory, cognitive architecture, learning, reasoning, decision-making, and cognitive development. Topics covered from behavioral, computational, and neural perspectives.
Studies how the senses work and how physical stimuli are transformed into signals in the nervous system. Examines how the brain uses those signals to make inferences about the world, and uses illusions and demonstrations to gain insight into those inferences. Emphasizes audition and vision, with some discussion of touch, taste, and smell. Provides experience with psychophysical methods.
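For a concrete sense of the psychophysical methods mentioned above (a minimal sketch, not course material; it assumes SciPy is available and the detection data are fabricated), one standard analysis fits a logistic psychometric function to proportion-detected data:

```python
# Minimal psychophysics sketch: fit a logistic psychometric function
# to simulated detection data (illustrative values only).
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, threshold, slope):
    return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

intensities = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])          # stimulus levels
prop_detected = np.array([0.05, 0.20, 0.45, 0.80, 0.95, 0.99])  # fake data

params, _ = curve_fit(psychometric, intensities, prop_detected, p0=[1.5, 2.0])
print(f"estimated threshold: {params[0]:.2f}, slope: {params[1]:.2f}")
```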
Daniel Polley, Bertrand Delgutte, M. C. Brown
Neural structures and mechanisms mediating the detection, localization and recognition of sounds. General principles are conveyed by theme discussions of auditory masking, sound localization, musical pitch, cochlear implants, cortical plasticity and auditory scene analysis. Follows Harvard FAS calendar.
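As an illustration of the sound-localization theme (a sketch under invented parameters, not course code), the example below estimates an interaural time difference by cross-correlating the signals arriving at the two ears:

```python
# Illustrative ITD estimation by cross-correlation (not from the course).
import numpy as np

rng = np.random.default_rng(3)
fs = 44100
sound = rng.normal(size=fs // 10)          # 100 ms of broadband noise
true_itd_samples = 20                      # ~0.45 ms delay at 44.1 kHz

left = sound
right = np.roll(sound, true_itd_samples)   # right ear hears a delayed copy

lags = np.arange(-40, 41)
corr = [np.dot(left, np.roll(right, -lag)) for lag in lags]
print("estimated ITD (samples):", lags[int(np.argmax(corr))])
```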

Harvard University

This course provides a foundational overview of the fundamental ideas in computational neuroscience and the study of Biological Intelligence. At the same time, the course will connect the study of brains to the blossoming and rapid development of ideas in Artificial Intelligence. Topics covered include the biophysics of computation, neural networks, machine learning, Bayesian models, theory of learning, deep convolutional networks, generative adversarial networks, neural coding, control and dynamics of neural activity, applications to brain-machine interfaces, connectomics, among others. Lectures will be taught by leading experts in the field. Students will conduct a class project utilizing the knowledge gained in the course.
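To illustrate the neural coding topic in the list above (a sketch with invented tuning parameters, not course material; circular wraparound of orientation is ignored for simplicity), here is a population of Poisson neurons with Gaussian tuning curves encoding a stimulus orientation, followed by a simple population-average decode:

```python
# Illustrative population-coding sketch (not from the course).
import numpy as np

rng = np.random.default_rng(1)
preferred = np.linspace(0, 180, 12, endpoint=False)  # preferred orientations (deg)
stimulus = 70.0                                       # true orientation (deg)

def tuning(stim, pref, peak_rate=30.0, width=20.0):
    # Gaussian tuning curve; wraparound at 180 deg ignored for simplicity.
    return peak_rate * np.exp(-0.5 * ((stim - pref) / width) ** 2)

rates = tuning(stimulus, preferred)       # expected firing rates (Hz)
counts = rng.poisson(rates * 0.5)         # spike counts in a 0.5 s window
decoded = np.sum(counts * preferred) / np.sum(counts)
print(f"true: {stimulus:.1f} deg, decoded: {decoded:.1f} deg")
```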
Life is full of decisions, but not all decisions are made equal. Choices can be big and consequential (should I focus on my success, my family, or my passion?) or small and everyday (going out, or staying in?). This course will introduce you to the cognitive science of judging and choosing. You will learn about rational planning, the kind a perfect intelligence might carry out; the common simplifications and shortcuts that imperfect humans use, and why these may actually be appealing approximations for any decision-making system; regret over choices taken and not taken; making decisions with others; and transformative decisions, the ones that change who you are as a person. As we cover these topics, we will consider how to apply the insights from the psychology of decision making to your own ordinary and extraordinary choices.
Computational Cognitive Neuroscience
"What I cannot create, I do not understand." – Richard Feynman This course applies Richard Feynman's dictum to the brain, by teaching students how to simulate brain function with computer programs. Special emphasis will be placed on how neurobiological mechanisms give rise to cognitive processes like learning, memory, decision-making, and object perception. Students will learn how to understand experimental data through the lens of computational models, and ultimately how to build their own models.
Follows trends in modern brain theory, focusing on local recurrent circuits and deep multi-stage architectures. Explores the relation between network structure, dynamics, and function. Introduces tools from information theory, dynamical systems, statistics, statistical physics, and learning theory in the study of experience-dependent neural computation. Specific topics include: computational principles of early sensory systems; unsupervised, supervised and reinforcement learning; attractor computation and memory in recurrent cortical circuits; noise, chaos, and coding in neuronal systems; learning and computation in deep neural networks in the brain and in AI systems.
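As an illustration of "attractor computation and memory in recurrent cortical circuits" (a sketch with invented sizes and parameters, not course material), the example below stores binary patterns in a small Hopfield network with a Hebbian rule and recovers one from a corrupted cue:

```python
# Illustrative Hopfield attractor-memory sketch (not from the course).
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weight matrix (no self-connections)
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

# Corrupt the first pattern by flipping 15% of its units
cue = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
cue[flip] *= -1

# Asynchronous updates until the state settles into an attractor
state = cue.copy()
for _ in range(10):
    for i in rng.permutation(N):
        state[i] = 1 if W[i] @ state >= 0 else -1

print("overlap with stored pattern:", int(state @ patterns[0]), "of", N)
```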