[R] lm Regression takes 24+ GB RAM - Error message
Jonas125
schleeberger.j at pg.com
Wed Mar 6 10:51:27 CET 2013
Hello,
I am a rather inexperienced R user (I learned the language a month ago) and ran into the following problem on a local computer with 6 cores, 24 GB of RAM, and 64-bit R 2.15. I have not installed any additional packages.
1. I load a data table (with mixed data types) of about 730 MB via read.table
2. I add 2 calculated columns
3. I split the dataset by 5 criteria
4. I run lm on each piece of the split, with the calculated columns as the variables (a minimal sketch of these steps follows below)
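In outline, the code looks roughly like this; the file name, column names, and grouping factors are placeholders, not the real ones:

dat <- read.table("mydata.txt", header = TRUE)   # ~730 MB file, mixed column types
dat$calc1 <- dat$a / dat$b                       # first calculated column
dat$calc2 <- dat$a - dat$c                       # second calculated column
grp  <- split(dat, list(dat$f1, dat$f2, dat$f3, dat$f4, dat$f5), drop = TRUE)  # split by 5 criteria
fits <- lapply(grp, function(g) lm(calc2 ~ calc1, data = g))                   # one lm per group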
RAM consumption rises rapidly and stays at 24 GB for a couple of minutes.
The result:
Error: cannot allocate vector of size 5.0 Mb
In addition: There were 50 or more warnings (use warnings() to see the first 50)
--> Reached total allocation of 24559Mb
My code works perfectly fine on a smaller dataset. I am surprised by the errors, since the lm calculations should mainly keep the CPU busy and the output cannot be that large, can it? (I cannot check the object size of the lm fit because of the error.)
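On a smaller dataset where everything works, I would check the size of a single fit roughly like this (using the same placeholder names as in the sketch above):

small_fit <- lm(calc2 ~ calc1, data = grp[[1]])
print(object.size(small_fit), units = "Mb")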
Right now I am running only one linear model, but I actually want to run six!
Is Windows restricting how much RAM R can use? Are there any settings I can change?
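As far as I understand, on Windows the current allocation limit can at least be queried (and, up to the physical RAM, raised) like this; this assumes 64-bit R on Windows, as in my case:

memory.limit()            # current allocation limit in Mb
memory.size(max = TRUE)   # most memory used so far in this session, in Mb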
A RAM upgrade is not an option. Do I need to use a different R package (bigmemory, perhaps)?
Thanks in advance for your help!!