Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy
Date Posted:  November 27, 2019
Date Recorded:  October 29, 2019
Speaker(s):  Thomas Icard
  • Brains, Minds and Machines Seminar Series
Description: 

Thomas Icard, Stanford

Abstract: How might we assess the expressive capacity of different classes of probabilistic generative models? The subject of this talk is an approach that appeals to machines of increasing strength (finite-state, recursive, etc.) or, equivalently, to probabilistic grammars of increasing complexity, giving rise to a probabilistic version of the familiar Chomsky hierarchy. Many common probabilistic models — hidden Markov models, generative neural networks, probabilistic programming languages, etc. — fit naturally into the hierarchy. The aim of the talk is to give as comprehensive a picture as possible of the landscape of distributions that can be expressed at each level of the hierarchy. Of special interest is what this pattern of results might mean for cognitive modeling.
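To make the idea of a probabilistic grammar as a generative model concrete, here is a minimal sketch (the specific grammar and probabilities are illustrative, not taken from the talk): a probabilistic context-free grammar whose rules carry probabilities summing to one per nonterminal, which induces a distribution over the strings it generates.

```python
import random

# Illustrative probabilistic context-free grammar (hypothetical example):
#   S -> "a" S "b"  with probability 0.4
#   S -> ""         with probability 0.6
# Sampling top-down from it induces a distribution over strings a^n b^n,
# a language beyond finite-state (hence beyond hidden Markov) models.
RULES = {
    "S": [(0.4, ["a", "S", "b"]), (0.6, [])],
}

def sample(symbol="S", rng=random):
    """Expand `symbol` top-down, choosing each rule by its probability."""
    if symbol not in RULES:  # terminal symbol: emit as-is
        return symbol
    r, acc = rng.random(), 0.0
    for prob, rhs in RULES[symbol]:
        acc += prob
        if r < acc:
            return "".join(sample(s, rng) for s in rhs)
    return ""  # numerical safety fallback

def string_prob(n):
    """Probability that this grammar generates a^n b^n: 0.4^n * 0.6."""
    return 0.4 ** n * 0.6
```

Every sampled string has matched `a`s and `b`s, and the string probabilities sum to one over n, so the grammar defines a proper distribution at the context-free level of the hierarchy.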