Learning physical parameters from dynamic scenes.

Title: Learning physical parameters from dynamic scenes.
Publication Type: Journal Article
Year of Publication: 2018
Authors: Ullman, TD, Stuhlmüller, A, Goodman, ND, Tenenbaum, JB
Journal: Cognitive Psychology
Volume: 104
Pagination: 57-82
Date Published: August 2018
Keywords: intuitive physics, intuitive theory, learning, physical reasoning, probabilistic inference
Abstract

Humans acquire their most basic physical concepts early in development, and continue to enrich and expand their intuitive physics throughout life as they are exposed to more and varied dynamical environments. We introduce a hierarchical Bayesian framework to explain how people can learn physical parameters at multiple levels. In contrast to previous Bayesian models of theory acquisition (Tenenbaum et al., 2011), we work with more expressive probabilistic program representations suitable for learning the forces and properties that govern how objects interact in dynamic scenes unfolding over time. We compare our model to human learners on a challenging task of estimating multiple physical parameters in novel microworlds given short movies. This task requires people to reason simultaneously about multiple interacting physical laws and properties. People are generally able to learn in this setting and are consistent in their judgments. Yet they also make systematic errors indicative of the approximations people might make in solving this computationally demanding problem with limited computational resources. We propose two approximations that complement the top-down Bayesian approach. One approximation model relies on a more bottom-up feature-based inference scheme. The second approximation combines the strengths of the bottom-up and top-down approaches, by taking the feature-based inference as its point of departure for a search in physical-parameter space.

URL: https://www-sciencedirect-com.libproxy.mit.edu/science/article/pii/S0010028517301822
DOI: 10.1016/j.cogpsych.2017.05.006

Research Area: 

CBMM Relationship: 

  • CBMM Funded