Bayesian Inference in Generative Models

Date Posted: November 14, 2018

Date Recorded: November 13, 2018

Speaker(s): Luke Hewitt, MIT; Maddie Cusimano, MIT
  • Computational Tutorials

Description:

Speaker: Luke Hewitt, MIT
Talk and Q&A session prepared by: Maddie Cusimano & Luke Hewitt, MIT

Bayesian inference is ubiquitous in models and tools across cognitive science and neuroscience. While the mathematical formulation of Bayesian models in terms of prior and likelihood is simple, exact Bayesian inference is intractable for most models of interest. In this tutorial, we will cover a range of approximate inference methods, including sampling-based methods (e.g. MCMC, particle filters) and variational inference, and describe how neural networks can be used to speed up these methods. We will also introduce probabilistic programming languages, which provide tools for black-box Bayesian inference in complex models. Hands-on exercises include implementing inference algorithms for simple models and/or implementing complex models in a probabilistic programming language.
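As a flavor of the sampling-based methods mentioned above, here is a minimal sketch of the Metropolis-Hastings MCMC algorithm for a toy model of our own choosing (not from the tutorial materials): an unknown mean with a standard normal prior and unit-variance Gaussian likelihood, where the exact posterior is known and can be used as a sanity check.

```python
import math
import random

# Toy model (assumed for illustration):
#   prior:       mu ~ Normal(0, 1)
#   likelihood:  x_i ~ Normal(mu, 1), i = 1..n
random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(50)]

def log_prior(mu):
    # log Normal(mu; 0, 1), up to an additive constant
    return -0.5 * mu * mu

def log_likelihood(mu):
    # sum of log Normal(x_i; mu, 1), up to an additive constant
    return sum(-0.5 * (x - mu) ** 2 for x in data)

def log_posterior(mu):
    # unnormalized log posterior: log prior + log likelihood
    return log_prior(mu) + log_likelihood(mu)

def metropolis_hastings(n_samples, step=0.3):
    """Random-walk Metropolis: propose mu' ~ Normal(mu, step),
    accept with probability min(1, p(mu') / p(mu))."""
    mu = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = mu + random.gauss(0.0, step)
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(mu):
            mu = proposal
        samples.append(mu)
    return samples

samples = metropolis_hastings(5000)
burned = samples[2500:]  # discard burn-in
post_mean = sum(burned) / len(burned)
```

For this conjugate Gaussian model the posterior mean is available in closed form, n * x̄ / (n + 1), so the MCMC estimate `post_mean` should land close to it; for realistic models of interest no such closed form exists, which is exactly when these approximate methods earn their keep.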

Additional Info:

Slides, references, and exercises: https://stellar.mit.edu/S/project/bcs-comp-tut/materials.html