Estimating pi using the method of moments
Happy Pi Day! I don’t encounter $\pi$ very much in my area of statistics, so this post might seem a little forced… In this post, I’m going to show one way to estimate $\pi$.
The starting point is the integral identity

$$\int_0^1 \sqrt{1 - x^2}\, dx = \frac{\pi}{4}.$$
There are two ways to see why this identity is true. The first is that the integral is simply computing the area of a quarter-circle with unit radius. The second is by explicitly evaluating the integral: substituting $x = \sin\theta$ (so $dx = \cos\theta \, d\theta$),

$$\int_0^1 \sqrt{1 - x^2}\, dx = \int_0^{\pi/2} \cos^2\theta \, d\theta = \int_0^{\pi/2} \frac{1 + \cos 2\theta}{2}\, d\theta = \frac{\pi}{4}.$$
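As a quick sanity check (this snippet is my addition, not part of the original derivation), we can verify the identity numerically with base R's `integrate()`:

```r
# Numerically approximate the integral of sqrt(1 - x^2) over [0, 1]
quarter_circle <- integrate(function(x) sqrt(1 - x^2), lower = 0, upper = 1)

quarter_circle$value  # close to 0.7853982
pi / 4                # 0.7853982
```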
If $X \sim \mathrm{Unif}[0, 1]$, then the integral identity means that

$$E\left[4\sqrt{1 - X^2}\right] = \pi.$$
Hence, if we take i.i.d. draws $X_1, \dots, X_n$ from the uniform distribution on $[0, 1]$, it is reasonable to expect (by the law of large numbers) that

$$\frac{1}{n} \sum_{i=1}^n 4\sqrt{1 - X_i^2} \approx \pi.$$
The code below shows how well this estimation procedure does for one run as the sample size goes from 1 to 200:
```r
set.seed(1)
N <- 200
x <- runif(N)
samples <- 4 * sqrt(1 - x^2)
estimates <- cumsum(samples) / 1:N
plot(1:N, estimates, type = "l",
     xlab = "Sample size", ylab = "Estimate of pi",
     main = "Estimates of pi vs. sample size")
abline(h = pi, col = "red", lty = 2)
```
The next plot shows the relative error on the y-axis instead (the red dashed line represents 1% relative error):
```r
rel_error <- abs(estimates - pi) / pi * 100
plot(1:N, rel_error, type = "l",
     xlab = "Sample size", ylab = "Relative error (%)",
     main = "Relative error vs. sample size")
abline(h = 0)
abline(h = 1, col = "red", lty = 2)
```
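As a rough back-of-the-envelope addition of my own (not from the original post): the variance of a single term $4\sqrt{1 - X^2}$ is $E[16(1 - X^2)] - \pi^2 = 32/3 - \pi^2 \approx 0.797$, so we can sketch how many samples are needed before the estimator's standard error drops below 1% of $\pi$:

```r
# Var(4 * sqrt(1 - X^2)) = E[16 * (1 - X^2)] - pi^2 = 32/3 - pi^2
v <- 32 / 3 - pi^2

# Sample size n at which sqrt(v / n) falls below 1% of pi
target <- 0.01 * pi
n_needed <- ceiling(v / target^2)
n_needed  # on the order of several hundred samples
```

This squares with the plots above: with only 200 samples, a single run can easily sit outside the 1% band.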