CBMM faculty offer graduate and undergraduate courses that integrate computational and empirical approaches to the study of problems related to intelligence. These courses introduce some of the mathematical frameworks used to formulate computational models, and the experimental methods used in neuroscience and cognitive science to study the neural implementations of intelligent processes and the manifestation of these computations in human cognitive behavior. Examples of the integration of these perspectives are drawn from current research on intelligence. Materials for many of these courses are available online. Most graduate courses are open to advanced undergraduates with the appropriate background. Enrollment for courses is handled through the respective institutions.

Fall 2014

Massachusetts Institute of Technology (MIT)

Computational Aspects of Biological Learning
Takes a computational approach to learning in the brain by neurons and synapses. Examines supervised and unsupervised learning as well as possible biological substrates, including Hebb synapses and the related topics of Oja flow and principal components analysis. Discusses hypothetical computational primitives in the nervous system, and the implications for unsupervised learning algorithms underlying the development of tuning properties of cortical neurons. Also focuses on a broad class of biologically plausible learning strategies.
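The description links Oja flow to principal components analysis. As a minimal sketch of that connection (synthetic data, illustrative learning rate; not course material), Oja's single-neuron update drives the weight vector toward the first principal component of its inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean inputs whose dominant variance axis is [1, 1]/sqrt(2).
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = (rng.normal(size=(5000, 2)) * [2.0, 0.5]) @ R.T

w = rng.normal(size=2)
eta = 0.005                        # illustrative learning rate
for x in X:
    y = w @ x                      # linear neuron's response
    w += eta * y * (x - y * w)     # Oja's rule: Hebbian term plus decay

# Compare against the true first principal component of the data.
pc1 = np.linalg.eigh(np.cov(X.T))[1][:, -1]
alignment = abs(w @ pc1) / np.linalg.norm(w)
print(round(alignment, 3))         # close to 1.0 once converged
```

The decay term `-eta * y**2 * w` keeps the weight norm bounded near 1, which is what distinguishes Oja's rule from a plain (divergent) Hebbian update.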

Spring 2014

Massachusetts Institute of Technology (MIT), Harvard University

Computational Models and Cognitive Development
Explores the prospects for “reverse engineering” infant and early childhood cognition over the first three years of life, with the goal of laying the foundations for a computational account of what children know and how they come to know it, expressed in the language of contemporary engineering approaches to intelligence. Focuses on core knowledge systems, such as core intuitive physics, psychology, sociology, space and number, as well as the learning mechanisms that extend, enrich and transform these core systems as children grow. Integrates related research from cognitive neuroscience and comparative studies of cognition in non-human species.

Harvard University

Computational Learning Theory
Possibilities of, and limitations on, learning by computational agents. Topics include computational models, polynomial-time learnability, learning from examples and learning from queries to oracles. Applications to Boolean functions, automata and geometric functions.

University of California, Los Angeles (UCLA)

Introduction to Pattern Recognition and Machine Learning
Introduction to pattern analysis and machine intelligence designed for advanced undergraduate and graduate students. Topics include Bayes decision theory, learning parametric distributions, non-parametric methods, regression, Adaboost, perceptrons, support vector machines, principal components analysis, nonlinear dimension reduction, independent component analysis, K-means analysis, and probability models.
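Of the topics listed, the perceptron is the simplest to illustrate. A hedged sketch on synthetic, linearly separable data (the data, margin, and epoch count are illustrative assumptions, not course material):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: label is the sign of x1 + x2, with points near the boundary
# removed so that a margin exists and the perceptron converges quickly.
X = rng.uniform(-1, 1, size=(200, 2))
X = X[np.abs(X.sum(axis=1)) > 0.2]
y = np.where(X.sum(axis=1) > 0, 1, -1)

w, b = np.zeros(2), 0.0
for _ in range(50):                      # epochs
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:       # misclassified: update toward yi
            w += yi * xi
            b += yi
            errors += 1
    if errors == 0:                      # converged on separable data
        break

accuracy = np.mean(np.sign(X @ w + b) == y)
print(accuracy)                          # 1.0 after convergence
```

The classic convergence guarantee applies here: on separable data with margin γ, the number of updates is bounded by (R/γ)², so training halts after finitely many mistakes.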

IAP 2014

Massachusetts Institute of Technology (MIT)

Methods for Analyzing Neural Data
Covers methods that are useful for analyzing neural data, including conventional statistics, mutual information, point process models and decoding analyses. Emphasis is on explaining the basic mathematical intuitions behind these methods and giving practical, hands-on experience in applying them to real data. The class is divided into lectures that explain the different methods and laboratory sessions in which students analyze real data. Examples focus on neural spiking activity, but other types of signals, including MEG signals and local field potentials, are also discussed.
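As a toy illustration of the decoding analyses mentioned above, the sketch below decodes which of two stimuli was presented from a single neuron's spike count, under assumed Poisson firing rates (the rates and trial counts are hypothetical, not taken from the course):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: one neuron fires at different Poisson rates
# (mean spike counts per trial) for two stimulus conditions.
rates = {"A": 5.0, "B": 15.0}
trials = {s: rng.poisson(r, size=200) for s, r in rates.items()}

def decode(count, rates):
    # Maximum-likelihood decoder: log P(count | rate) up to a constant
    # shared by both stimuli is count*log(rate) - rate.
    ll = {s: count * np.log(r) - r for s, r in rates.items()}
    return max(ll, key=ll.get)

correct = sum(decode(c, rates) == s
              for s, counts in trials.items()
              for c in counts)
accuracy = correct / 400
print(round(accuracy, 2))    # well above the 0.5 chance level
```

Because the constant term log(count!) is the same for both hypotheses, it can be dropped, leaving a simple threshold on the spike count as the decision rule.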

Fall 2013

Massachusetts Institute of Technology (MIT)

Vision and Learning: Computers and Brains
This course reviews and discusses research on the problem of learning to understand the world and interact with it using sensory information. Vision is used as the primary domain, and relevant learning approaches are examined from both computational and biological perspectives. Topics include learning in computational vision, recent advances and limitations of current learning methods, face processing by computers and the brain, learning in synapses, reinforcement learning, and Markov decision processes in computers and brains.

Fall 2012

Massachusetts Institute of Technology (MIT)

What is Intelligence?
The problem of intelligence – its nature, how it is produced by the brain and how it could be replicated in machines – is a deep and fundamental problem that cuts across multiple scientific disciplines. Philosophers have studied intelligence for centuries, but it is only in the last several decades that developments in a broad range of science and engineering fields have opened up a thriving “intelligence research” enterprise, making questions such as these approachable.