[R] 1024GB max memory on R for Windows XP?
Liaw, Andy
andy_liaw at merck.com
Fri Feb 20 22:19:32 CET 2004
For vectors and matrices (which are vectors with dim attributes), I believe
the memory needs to be contiguous, but not for lists.
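As a rough illustration (the sizes here are arbitrary, and whether a given
allocation fails depends entirely on the system), a single large vector needs
one contiguous block of address space, while a list spreads its data across
many smaller, independent allocations:

```r
## A single numeric vector of 2e8 doubles needs roughly 1.5 GB of
## *contiguous* address space, so on a fragmented 32-bit system it
## may fail with "cannot allocate vector of size ..." even when
## enough total memory is free:
## x <- numeric(2e8)

## A list of 20 vectors of 1e6 doubles only requires each ~8 MB
## chunk to be contiguous, so the chunks can land in separate
## free regions of the address space:
chunks <- lapply(1:20, function(i) numeric(1e6))
length(chunks)       # number of chunks
length(chunks[[1]])  # doubles per chunk
```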
You still have not indicated which particular "UNIX" you are using. On the
64-bit Linux box we have here with 16GB of RAM, an R process (compiled as a
64-bit application) can use nearly all of the physical RAM available, without
any special settings.
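As a generic check on any platform, gc() in base R reports how much memory
the R process has currently allocated (this is just standard base R, not
specific to any particular setup):

```r
## gc() triggers a garbage collection and returns a matrix
## summarizing current memory usage (cells used, Mb used, etc.)
## for both Ncells and Vcells:
x <- numeric(1e6)  # allocate roughly 8 MB
usage <- gc()
print(usage)
```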
Andy
> From: Jonathan Greenberg
>
>
> Does UNIX R have a similar command, or does it just take as much memory
> as it needs? On a related note, does the memory have to be contiguous
> on either type of system? I am not hitting my max memory even with the
> 2GB max mem set (I'm not even hitting 1.5GB) -- it is giving me errors
> such as:
>
> Error: cannot allocate vector of size 387873 Kb (I should point out
> that this value changes when I simply rerun the previous line).
>
> There isn't really an easy way of getting around this, since I'm using
> the prune.tree function -- seeing as how it's 1 line of code, I don't
> see how to optimize this.
>
> --j
>
> On 2/20/04 5:40 AM, "James MacDonald" <jmacdon at med.umich.edu> wrote:
>
> > --max-mem-size=2000M
>
>
> --
> Jonathan Greenberg
> Graduate Group in Ecology, U.C. Davis
> http://www.cstars.ucdavis.edu/~jongreen
> http://www.cstars.ucdavis.edu
> AIM: jgrn307 or jgrn3007
> MSN: jgrn307 at msn.com or jgrn3007 at msn.com
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html
>
>