Development of Intelligence

Research Thrust: Development of Intelligence

This series of lectures provides an overview of key insights into the development of cognitive reasoning abilities in infants and young children, and contrasts aspects of human cognitive development with the machine learning approaches used in many current AI systems. Some of these distinctions are highlighted in the friendly debate between Laura Schulz and Tomer Ullman. The lectures by Joshua Tenenbaum and Sam Gershman elaborate on probabilistic approaches to modeling learning and cognitive development, including Bayesian methods. The e-book by Noah Goodman and Joshua Tenenbaum, Probabilistic Models of Cognition, provides an overview of probabilistic approaches to modeling cognitive development and introduces the Church programming language.
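Church expresses a model as a generative program and inference as conditioning on observations. As a rough illustration of that idea (a Python sketch of Church's rejection-query construct, not Church syntax; the toy two-coin model is illustrative only):

```python
import random

def flip(p=0.5):
    """Random primitive: a biased coin flip."""
    return random.random() < p

def model():
    """Generative model: two coins and their disjunction."""
    a, b = flip(), flip()
    return a, b, (a or b)

def rejection_query(model, condition, n=100_000):
    """Approximate conditional inference: run the generative model many
    times and keep only the samples consistent with the condition."""
    return [s for s in (model() for _ in range(n)) if condition(s)]

# Infer P(a | a or b) by conditioning on the disjunction being true.
samples = rejection_query(model, condition=lambda s: s[2])
p_a_given_or = sum(s[0] for s in samples) / len(samples)
# converges to 2/3 as n grows
```

The key point is that the same generative program supports both forward simulation and conditional inference, which is what makes probabilistic programs a natural formalism for intuitive theories.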

Presentations

Laura Schulz: The Origins of Inquiry: Inference and Exploration in Early Childhood

Topics: Brief historical overview of key research that revolutionized the study of cognitive development; analogies between how scientists and children learn; overview of studies showing that (1) children's generalizations depend on how evidence is sampled (Gweon, Tenenbaum, Schulz, PNAS 2010), (2) children infer the relative probability of hypotheses and choose interventions most likely to achieve desired outcomes (Gweon, Schulz, Science 2011), and (3) children isolate variables to distinguish competing hypotheses (Cook, Goodman, Schulz, Cognition 2011); if children are so smart, why is learning so hard? Because of (1) limited image processing capabilities, (2) limited world knowledge, and (3) inductive biases that constrain learning (Schulz, Bonawitz, Griffiths, Developmental Psychology 2007; Bonawitz, Fischer, Schulz, J. Cognition & Development 2012; Bonawitz, van Schijndel, Friel, Schulz, Cognitive Psychology 2012; Schulz, Goodman, Tenenbaum, Jenkins, Cognition 2008)

Laura Schulz: Cognitive Development and Commonsense Reasoning, Part 1

Topics: Historical perspective on underestimating the challenge of commonsense intelligence in AI; studying children may provide key insights; early representations of objects (e.g. object permanence, Spelke objects, expectations of object behavior), causality, agents and goals (e.g. discriminating objects vs. agents, intentional actions), and learning; drawing rich, abstract inferences from sparse data is critical to rapid learning

Laura Schulz: Cognitive Development and Commonsense Reasoning, Part 2, & Joshua Tenenbaum: Machine vs. Human Learning and Development of Intuitive Physics

Topics: (Laura Schulz) Inferential economics; learning from instruction vs. exploration (Gweon, Pelton, Schulz, Cognitive Science 2011); rational learning through play
(Joshua Tenenbaum) General introduction to the CBMM research thrust on Development of Intelligence; introduction to the concept of probabilistic programs; learning as theory building; learning physics from dynamic scenes (Ullman, Stuhlmüller, Goodman, Tenenbaum, in prep 2014); hierarchical modeling approach, from meta-theories to theories to events; stages of children's development of intuitive physics concepts

Tomer Ullman vs. Laura Schulz debate: Theories, Imagination, and the Generation of New Ideas

Topics: (Tomer Ullman) What good is a theory?; the problem of search in theory space; stochastic search and its relevance to cognitive development
(Laura Schulz) Issues with stochastic search: the search space is infinite, and stochastic search does not make use of knowledge and abilities that children seem to have; proposal for goal-oriented hypothesis generation; what does it mean to think of a new idea?
(Tomer Ullman) Response to the critique of stochastic search
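The stochastic-search proposal at the center of this debate can be made concrete with a toy example (an illustrative sketch, not the actual model under discussion): here "theories" are simple threshold rules "x is positive iff x > t", and a Metropolis-style random walk proposes local edits to the current theory, accepting them in proportion to how well they explain the data.

```python
import random

# Hypothetical labeled observations; thresholds t = 3 or 4 explain them all.
data = [(1, False), (3, False), (5, True), (8, True)]

def likelihood(t, noise=0.05):
    """How well threshold rule t explains the data, allowing noisy labels."""
    p = 1.0
    for x, label in data:
        p *= (1 - noise) if (x > t) == label else noise
    return p

def stochastic_search(steps=2000):
    """Metropolis-style random walk over theories (threshold values)."""
    t = random.randint(0, 10)                  # start from a random theory
    for _ in range(steps):
        proposal = t + random.choice([-1, 1])  # local edit to the theory
        # accept with probability min(1, likelihood ratio)
        if 0 <= proposal <= 10 and random.random() < likelihood(proposal) / likelihood(t):
            t = proposal
    return t

best = stochastic_search()
# the chain spends most of its time at t = 3 or t = 4, the thresholds
# consistent with all four observations
```

Schulz's critique targets exactly this picture: real theory spaces are unbounded, and local random edits ignore the goal-directed knowledge that seems to guide children's hypothesis generation.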

Sam Gershman: Structure Learning, Clusters, Features, and Functions

Topics: Basic introduction to parameter learning, structure learning, nonparametric Bayes, mixture models, conditioning as clustering, learning relational concepts, multi-level category learning, latent feature models, function learning, Gaussian processes, and human function learning
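Of the topics above, the Gaussian-process view of function learning is easy to sketch compactly: the kernel encodes a prior over smooth functions, and learning is Bayesian posterior inference. A minimal regression example (illustrative values, not the lecture's code):

```python
import numpy as np

def rbf_kernel(x1, x2, length=1.0):
    """Squared-exponential kernel: nearby inputs get similar outputs."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of the GP at test inputs, given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf_kernel(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.sin(x)                     # toy "observed" function values
mean = gp_posterior_mean(x, y, np.array([1.0, 1.5]))
# mean[0] approximately recovers the training observation at x = 1;
# mean[1] interpolates smoothly between the observations at x = 1 and x = 2
```

In the human function-learning work Gershman discusses, the choice of kernel plays the role of an inductive bias over which functions people find easy to learn.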

Sam Gershman (continuation of previous talk), & Josh Tenenbaum: Bayesian Inference

Topics: (Sam Gershman) Application of Bayesian learning to motion perception; automatic structure learning
(Joshua Tenenbaum) Learning to learn: hierarchical Bayes; empirical studies of word learning and the relevant object features; transfer of concepts to real-world vocabulary learning; inductive biases; learning about feature variability; a hierarchical Bayesian model that accounts for the empirical observations (Kemp, Perfors, Tenenbaum, Developmental Science 2007; Salakhutdinov, Tenenbaum, Torralba, ICML 2010); learning structural forms with hierarchical Bayes (Kemp, Tenenbaum, PNAS 2008); learning the form of matrix decompositions (Grosse, Salakhutdinov, Freeman, Tenenbaum, UAI 2012)