
See how Deloitte uses R for actuarial analysis

[This article was first published on Revolutions, and kindly contributed to R-bloggers.]

Many thanks to Jim Guszcza (Predictive Analytics lead at Deloitte Consulting and Assistant Professor at UW-Madison), who gave a great webinar presentation yesterday on actuarial analysis with R.

Jim's demo (starting at the 20-minute mark in the video replay below) is a great way to get a sense of how R is used for exploratory data analysis and modeling, with live examples of fitting a mixture distribution to bimodal claims data and calculating loss reserves using Poisson regression.
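If you want to experiment with the first of those ideas yourself, here is a minimal sketch (not Jim's actual code) of fitting a two-component mixture to simulated bimodal claim severities using the mixtools package; the data and parameter values are invented purely for illustration:

    # Fit a two-component normal mixture to log claim amounts (illustrative sketch)
    library(mixtools)

    set.seed(42)
    # Simulated bimodal severities: many small attritional claims, fewer large ones
    claims <- c(rlnorm(800, meanlog = 7, sdlog = 0.5),
                rlnorm(200, meanlog = 10, sdlog = 0.7))

    # EM fit of the mixture on the log scale
    fit <- normalmixEM(log(claims), k = 2)
    fit$lambda  # estimated mixing weights
    fit$mu      # estimated component means (log scale)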

Video replay: https://www.youtube.com/v/yWexhOmkVKU?version=3&hl=en_US

Many actuaries use Excel to make calculations like these, but Jim makes a great point about the benefits of programming with data instead of spreadsheet cells at the 32:00 mark:

Just one simple line of [R] code that would work just as well for a 100-by-100 loss triangle as it would for a 10-by-10 triangle. No hidden cells in the spreadsheet, no risk of spreadsheet error. It's a little bit of code you could look at in one screen, it's replicable … and this does all the work that a spreadsheet would do. 
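To make that remark concrete, here is a hedged sketch of the over-dispersed Poisson chain-ladder idea behind it (the standard GLM formulation of loss reserving, not necessarily the exact code from the webinar), using a toy 3-by-3 incremental triangle in long format. The glm() call is the same whether the triangle is 10-by-10 or 100-by-100:

    # Toy incremental loss triangle in long format (values are made up)
    tri <- data.frame(
      origin = c(1, 1, 1, 2, 2, 3),   # accident period
      dev    = c(1, 2, 3, 1, 2, 1),   # development period
      inc    = c(100, 50, 25, 110, 55, 120)  # incremental paid
    )

    # One model call, independent of triangle size
    fit <- glm(inc ~ factor(origin) + factor(dev),
               family = poisson(link = "log"), data = tri)

    # Predict the unobserved future cells (below the latest diagonal)
    # and sum them to get the reserve estimate
    future <- subset(expand.grid(origin = 1:3, dev = 1:3),
                     origin + dev > 4)
    reserve <- sum(predict(fit, newdata = future, type = "response"))
    reserve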

Also, check out Jim's final case study at the 47-minute mark for a sneak preview of the next version of Revolution R Enterprise: big-data Generalized Linear Models. He uses the Allstate Claim Prediction Challenge data (from a recent Kaggle competition) to fit a Tweedie model to 13 million records of claim data. (The Tweedie distribution is often used to model insurance claims, where many claims are exactly zero and non-zero claims follow a continuous, Gamma-like distribution.) Using the forthcoming rxGLM function, he fit the model to this large data set in just over two minutes (140.22 seconds) on a single quad-core PC.
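rxGLM itself ships with Revolution R Enterprise, but the same kind of model can be sketched in open-source R using the statmod package's tweedie family with base glm(). The formula, simulated data, and var.power value below are illustrative assumptions, not the webinar's actual call:

    # Tweedie GLM sketch with simulated claim costs (illustrative only)
    library(statmod)

    set.seed(1)
    n <- 5000
    x <- runif(n)
    # Many exact zeros, with Gamma-like positive losses otherwise
    y <- ifelse(runif(n) < 0.7, 0,
                rgamma(n, shape = 2, scale = 500 * (1 + x)))

    # var.power between 1 and 2 gives the compound Poisson-Gamma Tweedie;
    # link.power = 0 corresponds to a log link
    fit <- glm(y ~ x, family = tweedie(var.power = 1.5, link.power = 0))
    summary(fit)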

You can download the slides from Jim's presentation and a replay for offline viewing at the link below.

Revolution Analytics Webinars: Actuarial Analytics in R

To leave a comment for the author, please follow the link and comment on their blog: Revolutions.
