One of the most straightforward examples of using Bayes' theorem to update our beliefs as we acquire more information is a simple Bernoulli process: a process that has only two possible outcomes.
Probably the most familiar example is a coin toss. The outcome of tossing a coin can only be heads or tails (barring the case where the coin lands perfectly on edge), but there are many other real-world examples of Bernoulli processes. In manufacturing, a widget may come off the production line either working or faulty, and we may wish to know the probability that a given widget will be faulty. We can estimate this probability using Bayesian updating.
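The updating step itself is a one-liner thanks to conjugacy: a Beta prior on the fault probability combined with Bernoulli observations yields a Beta posterior. Here is a minimal sketch with made-up numbers (the counts are illustrative, not from any real production line):

```r
# Conjugate Beta-Binomial updating (illustrative numbers).
# With a Beta(a, b) prior on the fault probability, observing k faulty
# widgets out of n yields a Beta(a + k, b + n - k) posterior.
a <- 1; b <- 1              # Beta(1,1) = uniform prior: no initial opinion
k <- 3; n <- 50             # suppose 3 faulty widgets observed out of 50
post_a <- a + k
post_b <- b + (n - k)
post_mean <- post_a / (post_a + post_b)  # posterior mean fault probability
post_mean                                # 4/52, roughly 0.077
```

With the uniform prior, the posterior mean is simply (k + 1)/(n + 2), which shrinks the raw fault rate k/n slightly toward 1/2 when data are scarce.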
I’ve put together this little piece of R code to help visualize how our beliefs about the probability of success (heads, a functioning widget, etc.) are updated as we observe more and more outcomes.
## Simulate Bayesian Binomial updating
sim_bayes <- function(p = 0.5, N = 10, y_lim = 15) {
  success <- 0
  # Start from a flat Beta(1,1) prior
  curve(dbeta(x, 1, 1), xlim = c(0, 1), ylim = c(0, y_lim),
        xlab = 'p', ylab = 'Posterior Density', lty = 2)
  legend('topright',
         legend = c('Prior', 'Updated Posteriors', 'Final Posterior'),
         lty = c(2, 1, 1), col = c('black', 'black', 'red'))
  for (i in 1:N) {
    # Draw one Bernoulli(p) observation
    if (runif(1, 0, 1) <= p) success <- success + 1
    # Posterior after i observations is Beta(successes + 1, failures + 1)
    curve(dbeta(x, success + 1, (i - success) + 1), add = TRUE)
    print(paste(success, "successes and", i - success, "failures"))
  }
  # Highlight the final posterior in red
  curve(dbeta(x, success + 1, (i - success) + 1),
        add = TRUE, col = 'red', lwd = 1.5)
}

sim_bayes(p = 0.6, N = 90)
The result is a plot of the posterior distributions (each of which becomes the prior for the next observation) as we make more and more observations from a Bernoulli process.
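As the plot suggests, the posterior concentrates around the true success probability as observations accumulate. A quick non-graphical check of the same idea (a sketch, again assuming the flat Beta(1,1) prior used above):

```r
# The posterior mean (successes + 1) / (N + 2) should approach the true p
# as N grows. Illustrative simulation, not part of the original post.
set.seed(42)
p <- 0.6
N <- 5000
successes <- sum(runif(N) <= p)          # N Bernoulli(p) draws
post_mean <- (successes + 1) / (N + 2)   # mean of Beta(successes+1, failures+1)
post_mean                                # close to 0.6 for large N
```

The standard deviation of the Beta posterior shrinks roughly like 1/sqrt(N), which is why the curves in the plot get visibly narrower with each batch of observations.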