Bayesian Inference in Generative Models (49:45)
- Computational Tutorials
Speaker: Luke Hewitt, MIT
Talk preparation and Q&A session: Maddie Cusimano & Luke Hewitt, MIT
Bayesian inference is ubiquitous in models and tools across cognitive science and neuroscience. While the mathematical formulation of Bayesian models in terms of prior and likelihood is simple, exact Bayesian inference is intractable for most models of interest. In this tutorial, we will cover a range of approximate inference methods, including sampling-based methods (e.g. MCMC, particle filters) and variational inference, and describe how neural networks can be used to speed up these methods. We will also introduce probabilistic programming languages, which provide tools for black-box Bayesian inference in complex models. Hands-on exercises include implementing inference algorithms for simple models and/or implementing complex models in a probabilistic programming language.
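As a taste of the sampling-based methods covered in the tutorial, the sketch below implements a minimal random-walk Metropolis-Hastings sampler (the simplest MCMC algorithm) for a toy one-dimensional model. All names and the toy model here are illustrative, not from the tutorial materials: a standard normal prior on theta, a single observation y = 1.0 with unit-variance Gaussian noise, so the exact posterior is N(0.5, 0.5) and the sample mean should land near 0.5.

```python
import math
import random

def metropolis_hastings(log_post, init, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis sampler for a 1-D unnormalized log-posterior.

    log_post: function theta -> log prior(theta) + log likelihood(theta).
    Proposals are symmetric Gaussian perturbations, so the acceptance
    ratio reduces to the posterior ratio.
    """
    rng = random.Random(seed)
    x = init
    lp = log_post(x)
    samples = []
    for _ in range(n_samples):
        # Propose a new state by perturbing the current one.
        x_new = x + rng.gauss(0.0, step)
        lp_new = log_post(x_new)
        # Accept with probability min(1, p(x_new|data) / p(x|data)).
        if math.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Toy model: prior N(0, 1), likelihood N(theta, 1), observed y = 1.0.
# Exact posterior is N(0.5, 0.5).
def log_post(theta):
    y = 1.0
    return -0.5 * theta**2 - 0.5 * (y - theta)**2

samples = metropolis_hastings(log_post, init=0.0, n_samples=20000)
burned = samples[5000:]  # discard burn-in before summarizing
print(sum(burned) / len(burned))  # posterior mean estimate, near 0.5
```

The same structure generalizes to the multivariate case by proposing a vector perturbation; probabilistic programming languages automate exactly this kind of inference so the modeler only writes the prior and likelihood.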
Slides:
Additional resources: