[This article was first published on Econometrics Beat: Dave Giles' Blog, and kindly contributed to R-bloggers].
“Big Data” = data that come in amounts that are too large for current computer hardware and software to deal with. That sounds like fun!
Norman Nie developed the well-known SPSS statistical package in the 1960s, and is currently President and CEO of Revolution Analytics, a California company that promotes the use of the R computing environment for handling complex data-analysis problems.
This recent piece, based on an interview with Nie, makes some interesting points:
- “….parallelized software running on inexpensive multiprocessor computers is the wave of the future for all types of big data computing. But the transition will be slow.” [Yep – that’s what a lot of us are doing now in our simulation work – DG]
- “The US is currently experiencing an acute shortage of mathematicians and others trained in related fields such as statistics.” [And not just the U.S. – DG]
- “Data analytics requires knowledge in multiple fields. For instance, a math major might need some familiarity with social sciences….. [Such as economics – DG]. And candidates with degrees in the social sciences often lack sufficient math training.” [Yep – math and statistics – DG]
Keep an eye on Revolution Analytics through their blog, Revolutions.
© 2011, David E. Giles