Basic MCMC and Bayesian statistics in… BASIC!
The BASIC programming language was at one point the most widespread programming language. Many home computers in the 80s came with BASIC (like the Commodore 64 and the Apple II), and in the 90s both DOS and Windows 95 included a copy of the QBasic IDE. QBasic was also the first programming language I encountered (I used it to write a couple of really horrible text adventures). Now, I haven't programmed in BASIC for almost 20 years, and I thought I would revisit this (from my current perspective) really weird language. As I spend a lot of time doing Bayesian data analysis, I thought it would be interesting to see what a Bayesian analysis would look like if I only used the tools I had 20 years ago, that is, BASIC.
This post walks through an implementation of the Metropolis-Hastings algorithm, a standard Markov chain Monte Carlo (MCMC) method that can be used to fit Bayesian models, in BASIC. I then use that implementation to fit a Laplace distribution to the most adorable dataset I could find: the number of wolf pups per den from a sample of 16 wolf dens. Finally, I summarize and plot the result, still using BASIC. So the target audience of this post is the intersection of people who have programmed in BASIC and people who are into Bayesian computation. I'm sure you are out there. Let's go!
Getting BASIC
There are many, many different versions of BASIC, but I'm going for the one that I grew up with: Microsoft QBasic 1.1. Now, QBasic is a relatively new BASIC dialect that has many advanced features such as user-defined types and (gasp!) functions. But I didn't use any fancy functions back in the 90s, and so I'm going to write old-school BASIC using line numbers and GOTO, which means that the code should be relatively easy to port to, say, Commodore 64 BASIC.
Getting QBasic is easy, as it seems to be freeware, and it can be downloaded here. Unless you are still using DOS, the next step is to install the DOSBox emulator. Once you've started up QBASIC.EXE you are greeted with a friendly, bright blue IDE, which you can try out by entering this customary hello world script that clears the screen and prints "HELLO WORLD".
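A minimal version of such a script, in the line-numbered style used throughout this post, would be:

10 CLS
20 PRINT "HELLO WORLD"

Here CLS clears the screen and PRINT writes the string; you can run the program from the Run menu in the QBasic IDE.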
Note that in QBasic the line numbers are not strictly necessary, but in older BASICs (like the one on the Commodore 64) they were required, as the program was executed in the order given by the line numbers.
What are we going to implement?
We are going to implement the Metropolis-Hastings algorithm, a classic Markov chain Monte Carlo (MCMC) algorithm, and one of the first you'll encounter if you study computational methods for fitting Bayesian models. This post won't explain the algorithm itself, but the Wikipedia article is an OK introduction.
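In brief: given the current parameter draw $\theta$, a new draw $\theta'$ is proposed from a symmetric proposal distribution and accepted with probability

$$\alpha = \min\left(1, \frac{\text{P}(\theta' \mid x)}{\text{P}(\theta \mid x)}\right) = \min\left(1, \exp\big(\log \text{P}(\theta' \mid x) - \log \text{P}(\theta \mid x)\big)\right)$$

otherwise the chain stays at $\theta$. The resulting sequence of draws forms a Markov chain whose stationary distribution is the posterior.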
The Bayesian model we are going to implement is a simple univariate Laplace distribution (just to give the Normal distribution a day off once in a while). The Laplace is similar to the Normal distribution in that it is continuous and symmetric, but it is more peaked and has heavier tails. It has two parameters: a location $\mu$, which then also defines its mean and median, and a scale $b$ which defines the width of the distribution.
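For reference, the density of the $\text{Laplace}(\mu, b)$ distribution, which we'll need when calculating the posterior, is

$$p(x \mid \mu, b) = \frac{1}{2b} \exp\left(-\frac{|x - \mu|}{b}\right)$$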
[Figure: probability density of the Laplace distribution. Image source, credits: IkamusumeFan]
Just as the sample mean is the maximum likelihood estimator of the mean of a Normal distribution, the sample median is the maximum likelihood estimator of the location parameter $\mu$ of a Laplace distribution. That's why I, somewhat sloppily, think of the Normal distribution as the "mean" distribution and of the Laplace distribution as the "median" distribution. To turn this into a fully Bayesian model we need prior distributions over the two parameters. Here I'm just going to be sloppy and use a $\text{Uniform}(-\infty, \infty)$ prior over both $\mu$ and $\log(b)$, that is, $\text{P}(\mu), \text{P}(\log(b)) \propto 1$. (Putting the prior on $\log(b)$ rather than on $b$ also conveniently keeps the scale positive when sampling.) The full model is then
$$
x \sim \text{Laplace}(\mu, b) \\
\mu \sim \text{Uniform}(-\infty, \infty) \\
\log(b) \sim \text{Uniform}(-\infty, \infty)
$$
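Since the priors are flat, the log posterior is, up to an additive constant, just the sum of the log densities of the data points, which is exactly what the model function in the R reference implementation below computes:

$$\log \text{P}(\mu, \log(b) \mid x) = \sum_{i=1}^{16} \log \text{Laplace}(x_i \mid \mu, b) + \text{const.}$$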
The data we are going to use is probably the cutest dataset I've worked with so far. It consists of counts of the number of wolf pups in a sample of 16 wolf dens (source): 5, 8, 7, 5, 3, 4, 3, 9, 5, 8, 5, 6, 5, 6, 4, 7.
[Photo: wolf pups. Image source, credits: spacebirdy / CC-BY-SA-3.0]
A reference implementation in R
Before delving into BASIC, here is a reference implementation in R of what we hope to achieve:
# The wolf pups dataset
x <- c(5, 8, 7, 5, 3, 4, 3, 9, 5, 8, 5, 6, 5, 6, 4, 7)

# The log posterior density of the Laplace distribution model, when assuming
# uniform/flat priors. The Laplace distribution is not part of base R but is
# available in the VGAM package.
model <- function(pars) {
  sum(VGAM::dlaplace(x, pars[1], exp(pars[2]), log = TRUE))
}

# The Metropolis-Hastings algorithm using a Uniform(-0.5, 0.5) proposal distribution
metrop <- function(n_samples, model, inits) {
  samples <- matrix(NA, nrow = n_samples, ncol = length(inits))
  samples[1, ] <- inits
  for(i in 2:n_samples) {
    curr_log_dens <- model(samples[i - 1, ])
    # Propose a new draw by perturbing the last draw with uniform noise
    proposal <- samples[i - 1, ] + runif(length(inits), -0.5, 0.5)
    proposal_log_dens <- model(proposal)
    # Accept the proposal with the Metropolis acceptance probability,
    # otherwise keep the last draw
    if(runif(1) < exp(proposal_log_dens - curr_log_dens)) {
      samples[i, ] <- proposal
    } else {
      samples[i, ] <- samples[i - 1, ]
    }
  }
  samples
}

samples <- metrop(n_samples = 1000, model, inits = c(0, 0))

# Plotting a traceplot
plot(samples[, 1], type = "l", ylab = expression(Location ~ mu), col = "blue")

# Calculating the posterior median and 95% CI, discarding the first 250 draws as "burn-in"
quantile(samples[250:1000, 1], c(0.025, 0.5, 0.975))
##  2.5%   50% 97.5% 
## 4.489 5.184 6.144