[This article was first published on Gregor Gorjanc, and kindly contributed to R-bloggers.]
R keeps all of its data in RAM. I think I read somewhere that S+ does not hold all the data in RAM, which makes S+ slower than R. On the other hand, when we have a lot of data, R chokes. I know that SAS at certain stages keeps data (tables) on disk in special files, but I do not know the details of interfacing with these files. My overall impression is that SAS is more efficient with big datasets than R, but there are exceptions: some special packages (see this tutorial for some info) and vibrant development which, to my impression, tackles the problem of large data in the spirit of SAS – I really do not know the details, so please bear with me.
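To make the "everything lives in RAM" point concrete, here is a minimal sketch using only base R; the exact size reported will vary a little by platform:

x <- rnorm(1e7)                       # 10 million doubles, all held in RAM
print(object.size(x), units = "Mb")   # roughly 76 Mb for this one vector
rm(x)                                 # remove the object ...
gc()                                  # ... and let R return the memory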
Anyway, what can you do when you hit the memory limit in R? Yesterday, I was fitting a so-called mixed model using the lmer() function from the lme4 package on a Dell Inspiron I1520 laptop with an Intel(R) Core(TM) Duo CPU T7500 @ 2.20GHz and 2046 MB of RAM, running MS Windows Vista. The fitting went fine, but when I wanted to summarize the returned object, I got the following error message:
> fit <- lmer(y ~ effect1 + ....)
> summary(fit)
Error: cannot allocate vector of size 130.4 Mb
In addition: There were 22 warnings (use warnings() to see them)
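For readers who have not seen lmer() before, here is a small self-contained example of this kind of call, using the sleepstudy dataset that ships with lme4 rather than my actual data and model:

library(lme4)
data(sleepstudy)
# random intercept and slope for Days within each Subject
fit <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)
summary(fit)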
First, I find this very odd, since I would expect fitting the model to be much more memory consuming than summarizing the fitted object! I will ask the developers of the lme4 package about this, but until then I tried to find my own way out.
The message “Error: cannot allocate vector of size 130.4 Mb” means that R cannot get an additional 130.4 Mb of RAM. That is weird, since the resource manager showed that I had at least circa 850 MB of RAM free. I printed the warnings using warnings() and got a set of messages saying:
> warnings()
1: In slot(from, what) <- slot(value, what) ... :
  Reached total allocation of 1535Mb: see help(memory.size)
...
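On Windows you can ask R how much memory it is currently using and what cap it is working under – a quick sketch (memory.size() and memory.limit() are Windows-only functions):

memory.size()            # MB currently in use by the R process
memory.size(max = TRUE)  # maximum MB obtained from the OS so far in this session
memory.limit()           # current memory cap in MB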
This did not make sense, since I have 2 GB of RAM. I closed all other applications and removed all objects in the R workspace except the fitted model object. However, that did not help. I started reading the help page of memory.size and I must confess that I did not understand or find anything useful there. Reading further, I followed the link to the help page of memory.limit and found out that on my computer R can by default use only up to ~1.5 GB of RAM, and that the user can increase this limit. The following code solved my problem.
> memory.limit()
[1] 1535.875
> memory.limit(size=1800)
> summary(fit)
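For completeness, here is the whole workaround in one place (a sketch assuming a Windows build of R, where memory.limit() is available; 1800 MB is simply what fit my 2 GB machine):

rm(list = setdiff(ls(), "fit"))   # drop everything except the fitted model
gc()                              # return the freed memory to the pool
memory.limit()                    # [1] 1535.875 -- the default cap
memory.limit(size = 1800)         # raise the cap to 1800 MB
summary(fit)                      # now there is room to summarize the fit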