You might think that advanced statistical analysis on Big Data is out of reach for those of us without access to expensive hardware and software. For example, back in April SAS proudly demonstrated running logistic regression on a billion records (and "just a few" variables) in under 80 seconds. But that feat required some serious hardware: two racks of Greenplum's Data Computing Appliance (DCA). Each rack of the DCA holds 16 servers, 192 Intel cores, and 768 GB of RAM, and pricing starts at $1 million. Add SAS license fees for its High Performance Analytics suite, and you're talking serious money.
We're currently beta testing Revolution R Enterprise 5.0, which adds the ability to harness a cluster of commodity machines running Windows Server for statistical analysis of huge data sets. In the video below, Revolution Analytics' Sue Ranney takes the beta for a spin, using the RevoScaleR package to run a logistic regression on 1.2 billion records of data on our 5-node cluster:
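For readers curious what such an analysis looks like in code, here is a minimal sketch of a distributed logistic regression with RevoScaleR. The function and class names (`RxHpcServer`, `rxSetComputeContext`, `RxXdfData`, `rxLogit`) reflect the RevoScaleR API of this era, but the exact arguments may differ in the 5.0 beta, and the cluster name, file name, and model formula below are purely illustrative:

```r
# Sketch only: assumes Revolution R Enterprise with the RevoScaleR package.
library(RevoScaleR)

# Point the compute context at the Windows HPC cluster.
# "cluster-head" and the share path are hypothetical placeholders.
cc <- RxHpcServer(headNode = "cluster-head",
                  shareDir = "\\\\cluster-head\\share")
rxSetComputeContext(cc)

# The big data set, stored in RevoScaleR's chunked .xdf format
# (file name and variables are hypothetical).
bigData <- RxXdfData("bigdata_1.2B.xdf")

# Fit the logistic regression; RevoScaleR distributes the work
# across the nodes named in the compute context.
fit <- rxLogit(outcome ~ var1 + var2, data = bigData)
summary(fit)
```

The key idea is that the model-fitting call itself stays the same whether the compute context is a single workstation or a cluster; only the `rxSetComputeContext()` step changes.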
For comparison, each of the five nodes in our cluster has 16 GB of RAM, an Intel Xeon E3-1230 quad-core processor (3.2 GHz, 8 MB cache), and a 1 TB hard drive. Total hardware cost: around $5,000. All the machines run Windows Server 2008 with the Windows HPC Pack and Revolution R Enterprise 5.0 beta 1.
And the time for that 1.2 billion row regression? 75 seconds: just as fast as the SAS result, at less than 1% of the hardware cost. See the details in the video linked below.
Revolution Analytics YouTube Channel: Logistic Regression in R with a Billion Rows