In this tutorial we will look at hierarchical Bayesian models of categorization, using a probabilistic programming language. Generative models can be described and simulated very conveniently in such languages, which makes them a great tool for exploring probabilistic models of cognition. The language we use here, WebPPL (pronounced “web people”), was developed partly for that purpose by Noah Goodman and Joshua Tenenbaum. As the name suggests, WebPPL is based on a web programming language: JavaScript. As a result, all code runs directly in your browser; there is no need to install any additional software.
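To give a flavour of what this looks like, here is a minimal sketch in plain JavaScript. WebPPL programs look similar, but random primitives such as `flip` are built in; the `flip` helper below is our own stand-in so the example runs anywhere:

```javascript
// Stand-in for WebPPL's built-in flip: returns true with probability p.
var flip = function (p) {
  return Math.random() < p;
};

// A tiny generative model: toss a coin with bias 0.7 three times
// and count the number of heads.
var tossThreeCoins = function () {
  var tosses = [flip(0.7), flip(0.7), flip(0.7)];
  return tosses.filter(function (t) { return t; }).length;
};

console.log(tossThreeCoins()); // an integer between 0 and 3
```

Running the model repeatedly produces different outcomes; that is the sense in which a generative model is "simulated" by running its program.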

In their online book Probabilistic Models of Cognition, Goodman and Tenenbaum present a wide range of probabilistic cognitive models. The goal of this tutorial is to implement some of the hierarchical Bayesian models from that book.

Important note: The changes you make to the code are lost once you refresh the page. If you want to save your code, copy-paste it into a text editor. This should not be a problem, as you don’t have to write much code.

Tip: If you want to play around with WebPPL, you can find an editor and more examples on

Submission. Submit your solutions to the homework exercises on Blackboard before the lecture on Thursday (15:00). Please hand in a single PDF file in which you explain your solutions. Also, make sure your answers are easy to find and not hidden between blocks of code!


Since we have only two hours for the actual tutorial, we need to get through the basics quickly. That will work best if you come prepared, even though the preparations shouldn’t take a lot of extra time.

  1. Make sure you have read the tutorial on Bayesian modelling by Amy Perfors et al. (2011). Section 3 (“acquiring inductive constraints”) is particularly relevant: the tutorial implements some of the models discussed there.
  2. JavaScript is not R, so please read this very brief introduction to JavaScript. As you will see, the syntax is not very different from R.
  3. Optionally, if you are interested in the underlying ideas, read this short, general introduction.
  4. Optionally, if you want to be super prepared, read this excerpt from the second chapter, on generative models. It may be particularly useful for those who have never seen any probability theory, but you should also be able to do the tutorial without it.

The tutorial

Probabilistic models are often best explained with simple scenarios, in this case a classic one: bags of coloured marbles. In the first part of the tutorial we will be concerned solely with bags and marbles: starting from the very basics of probability theory, we quickly work towards a hierarchical model. The second part deals with its cognitive interpretation.
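The basic generative process can be sketched in plain JavaScript: a bag is a distribution over marble colours, and observing data means drawing marbles from it. (In WebPPL the same idea is expressed with built-in distributions such as `categorical`; the colour names and proportions below are made up for illustration.)

```javascript
// A bag is a distribution over marble colours (proportions sum to 1).
var bag = { blue: 0.5, red: 0.3, green: 0.2 };

// Draw one marble according to the bag's colour proportions.
var drawMarble = function (bag) {
  var r = Math.random();
  var cumulative = 0;
  var colours = Object.keys(bag);
  for (var i = 0; i < colours.length; i++) {
    cumulative += bag[colours[i]];
    if (r < cumulative) return colours[i];
  }
  return colours[colours.length - 1]; // guard against rounding error
};

// Draw a handful of marbles from the bag.
var draws = [];
for (var j = 0; j < 5; j++) {
  draws.push(drawMarble(bag));
}
console.log(draws); // five colour names drawn from the bag
```

A hierarchical model adds a level above this: the proportions in each bag are themselves drawn from a higher-level distribution over bags, which is what the tutorial builds up to.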

  1. Part 1: Bags of marbles
  2. Part 2: Hierarchical models