Bayesian stats, Hierarchical models

Some notes related to MvR’s presentation on Bayesian stats and hierarchical models.

Author: Denis Schluppeck

Published: 2024-12-18

Basics

The posterior distribution of the parameter(s) \(\theta\) is the product of the likelihood of the data (given the parameters) and the prior over the parameters, divided by a normalising term.

\[ p(\theta \mid \mathcal{D}) = \frac{p( \mathcal{D} \mid \theta) p(\theta)}{p( \mathcal{D})} \]

where

  • \(p(\theta \mid \mathcal{D})\) is the posterior
  • \(p( \mathcal{D} \mid \theta)\), the likelihood
  • \(p(\theta)\), the prior and
  • \(p(\mathcal{D})\), a normalising term, obtained by summing (or integrating) \(p(\mathcal{D} \mid \theta ) p(\theta )\) over all possible values of \(\theta\) (written out below)
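
For a continuous parameter, this normalising term (sometimes called the marginal likelihood or evidence) is the integral

\[ p(\mathcal{D}) = \int p(\mathcal{D} \mid \theta) \, p(\theta) \, \mathrm{d}\theta \]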

Notes on MvR’s presentation

Python, PyMC

You can learn about PyMC and Bayesian Modelling on the project website.

Julia, Turing.jl

If you prefer Julia, you can dig into the following package, which looks very mature: https://turinglang.org/ (e.g. Zoubin Ghahramani is on the team for this project 😎).

For a #julialang-tinged introduction to Bayesian inference, watch a presentation on Turing.jl.

I have also had a play around with a linear regression example, which is not a huge jump for Matlab users and is easy to translate from the model-specification equations to code!

Julia example code.
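
A minimal sketch of such a model in Turing.jl might look like the following (the simulated data, priors, and variable names here are my own illustrative choices, not necessarily those from the presentation):

```julia
using Turing, Random

# simulate some data from a known straight line (illustrative values)
Random.seed!(42)
x = collect(range(0, 10; length=50))
y = 2.0 .+ 0.5 .* x .+ 0.3 .* randn(length(x))

# the model spec translates almost line-by-line into code
@model function linreg(x, y)
    α ~ Normal(0, 10)                     # prior for the intercept
    β ~ Normal(0, 10)                     # prior for the slope
    σ ~ truncated(Normal(0, 5); lower=0)  # noise scale, constrained positive
    for i in eachindex(y)
        y[i] ~ Normal(α + β * x[i], σ)    # likelihood
    end
end

# draw posterior samples with the No-U-Turn sampler
chain = sample(linreg(x, y), NUTS(), 1_000)
```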

You get nice diagnostic plots out of the box.
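
For example, assuming the chain object from the sketch above:

```julia
using StatsPlots  # loads the plotting recipes for MCMCChains

plot(chain)       # trace plots and marginal densities for α, β and σ
```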

… and nice display tables of the MCMC diagnostic and summary values:
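
Again with the chain from above (exact column names vary a little between MCMCChains versions):

```julia
summarystats(chain)  # mean, std, effective sample size and R̂ per parameter
quantile(chain)      # posterior quantiles for each parameter
```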

Reading

Bishop, Christopher M. 2007. Pattern Recognition and Machine Learning (Information Science and Statistics). 1st ed. Springer.
Gelman, Andrew, Aki Vehtari, Daniel Simpson, Charles C. Margossian, Bob Carpenter, Yuling Yao, Lauren Kennedy, Jonah Gabry, Paul-Christian Bürkner, and Martin Modrák. 2020. “Bayesian Workflow.” https://arxiv.org/abs/2011.01808.