[R] using virtual memory in R (Tom Allen)

Thomas Allen hedbag at gmail.com
Mon Jan 19 03:20:54 CET 2009


Hi,

Are you using a 64-bit system? A 32-bit process can only address about
3GB, no matter how much swap space is free -- the limit is address
space, not physical RAM, so the OS paging to disk doesn't help.
http://msdn.microsoft.com/en-gb/library/aa366778.aspx
I used to use 32-bit Windows XP, then moved to 64-bit Ubuntu 8.04 to
solve my memory problems.

If you are on 64-bit, try playing with the command-line arguments for
starting R (e.g. --max-mem-size on Windows), and also look at the
functions ?memory.limit and ?memory.size.
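For example, something along these lines should work on the Windows
build of R (sizes are in MB; the exact ceiling depends on your OS and
whether R itself is a 32-bit or 64-bit build):

```r
## Windows-only helpers; sizes are reported in MB.
memory.size(max = TRUE)   # most memory obtained from the OS so far
memory.limit()            # current allocation limit for this session

## Try to raise the limit toward your installed 4GB of RAM.
## On a 32-bit build this still tops out around 2-3GB regardless.
memory.limit(size = 4000)
```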

>
>I'm using R-2.8.1 on Windows Vista and have 4GB RAM. I'm trying to run lda
>from the MASS package on a fairly large dataset and keep running out of
>memory ("Cannot allocate vector of size ...").
>
>I've tried freeing up as much memory as possible with gc(). I tried using the
>ff package, but that would need modifications to the way lda accesses the
>memory-mapped data. The error I get is "invalid 'type' (list) for variable
>..."
>
>Basically, I need someone's help on how I can force R to use the large
>swaths of empty space on my hard drive as virtual memory. As I understand
>it, virtual memory increases the address space. So why isn't R capable of
>using my hard drive? I don't mind the hit in performance. How do I get this
>to work? And if I can't, why can't I?
>




More information about the R-help mailing list