Last week, I gave a presentation about the concept of and intuition behind probabilistic programming and model-based machine learning in front of a general audience. You can read my extended notes here.
Drawing on ideas from Winn and Bishop’s “Model-Based Machine Learning” and van de Meent et al.’s “An Introduction to Probabilistic Programming”, I try to show why combining a data-generating process with an abstracted inference procedure is such a powerful idea, by walking through the example of a simple survival model.
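To make the separation of model and inference concrete, here is a minimal sketch in Python (the notes themselves use R). Everything in it is a hypothetical illustration, not code from the notes: an exponential survival model with a Gamma(2, 1) prior over the rate, paired with a generic grid-approximation routine that knows nothing about survival analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Model: the data-generating process (hypothetical example) ---
# Survival times are exponential with rate lam; the prior on lam is Gamma(2, 1).
def sample_survival_times(lam, n):
    return rng.exponential(scale=1.0 / lam, size=n)

# --- Inference: abstracted away from any particular model ---
# A generic grid approximation that accepts any 1-D log-prior/log-likelihood pair.
def grid_posterior(log_prior, log_lik, grid, data):
    logp = log_prior(grid) + np.array([log_lik(g, data) for g in grid])
    logp -= logp.max()                  # subtract max for numerical stability
    p = np.exp(logp)
    return p / p.sum()                  # normalize to a discrete posterior

# Plug the survival model's densities into the generic inference routine.
log_prior = lambda lam: np.log(lam) - lam                       # Gamma(2, 1), up to a constant
log_lik = lambda lam, t: len(t) * np.log(lam) - lam * t.sum()   # exponential likelihood

true_lam = 0.5
data = sample_survival_times(true_lam, 200)
grid = np.linspace(0.01, 3.0, 500)
post = grid_posterior(log_prior, log_lik, grid, data)
print("posterior mean rate:", np.sum(grid * post))
```

The point of the sketch is the division of labor: `sample_survival_times` and the two density functions describe the model, while `grid_posterior` is a reusable inference routine that would work unchanged for any other one-parameter model. Probabilistic programming systems automate exactly this separation at scale.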
Preparing this document gave me an excuse to try the wonderful Tufte handout style for R Markdown documents by JJ Allaire and Yihui Xie. Furthermore, inspired by the work of Jessica Hullman, Matthew Kay, and Claus Wilke, I took my best shot at creating hypothetical outcome plots to visualize the uncertainty induced by the prior distribution. I couldn’t have done it without David Robinson’s and Thomas Lin Pedersen’s gganimate package.