[R-sig-Debian] R memory allocation in Linux

Paul Johnson pauljohn32 at gmail.com
Sat Nov 6 21:05:08 CET 2010


On Fri, Nov 5, 2010 at 2:10 PM, ricardo souza <ricsouzabh at yahoo.com.br> wrote:
> Dear all,
>
> I am using 32-bit Ubuntu Linux with 4 GB of RAM.  I am running a very small script and I always get the same error message:  CANNOT ALLOCATE A VECTOR OF SIZE 231.8 Mb.


It can't be such a small R program if it is trying to allocate one
vector of 231.8 Mb.  Think for a minute about how much contiguous
storage you are asking for.
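
As a quick sanity check in R (assuming the vector holds doubles, at 8
bytes each, which is what R's error message counts in):

  ## 231.8 Mb of 8-byte doubles is about 30 million elements, all of
  ## which must fit in one contiguous block of address space:
  231.8 * 1024^2 / 8    # roughly 3.0e7 elements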

I went and read the thread you referred to on r-help.  Brian Ripley
clearly states that you should run a 64-bit OS if you hope to claim
such a large piece of contiguous memory.
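
In case it is useful, you can check from inside R whether you are
running a 32-bit or a 64-bit build:

  .Machine$sizeof.pointer   # 4 means a 32-bit build, 8 means 64-bit
  sessionInfo()             # platform string, e.g. i686 vs x86_64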

Beyond that, it is hard to say what the fix is.  You can arrive at
the "vector too big" problem in many ways.  Some you can fix, some
you can avoid.  If you have such a small program, you should post it
here so we can at least try it and see what we get.  My first thought
would be to reorganize your code so you don't ask for such gigantic
vectors.
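
For example, many jobs that die asking for one huge vector will run
fine if you walk through the data in chunks.  A rough sketch (the
file name and chunk size are placeholders, and I am assuming a
numeric-only CSV):

  con <- file("bigfile.csv", open = "r")
  invisible(readLines(con, n = 1))        # skip the header line
  totals <- NULL
  repeat {
    ## read.csv() errors at end of input, which ends the loop
    chunk <- tryCatch(read.csv(con, header = FALSE, nrows = 10000),
                      error = function(e) NULL)
    if (is.null(chunk)) break
    s <- colSums(chunk)
    totals <- if (is.null(totals)) s else totals + s
  }
  close(con)
  totals    # column sums, computed without one giant allocation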

If that is impossible, look on CRAN for the packages that are
supposed to help with big vectors.  As I recall, there is one family
of packages with names like "big..." and another with a name like
"ff...".  Both try to work around the "can't get a gigantic piece of
contiguous memory" problem.  I would give you the details, but I'm
not able to get an answer from r-project.org right now, so I can't
look them up.
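
Working from memory, so treat the details as approximate: with the
"ff" package the data live in a memory-mapped file on disk, and your
R session only holds a small proxy object:

  library(ff)
  x <- ff(vmode = "double", length = 30e6)  # ~229 Mb, but on disk
  x[1:5] <- rnorm(5)                        # indexed reads/writes work
  mean(x[1:1000])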

I experimented with these for a day last winter and concluded that
they were still too experimental for me, but they have made a lot of
progress lately.

>
> I have been reading carefully the instructions in ?Memory.  Using the function gc() I got very low memory numbers (please see below).  I know that this has been posted several times on r-help (http://tolstoy.newcastle.edu.au/R/help/05/06/7565.html#7627qlink2).  However, I have not yet found a solution to my memory issue on Linux.  Could somebody please give some instructions on how to improve my memory under Linux?
>
>> gc()
>          used (Mb) gc trigger (Mb) max used (Mb)
> Ncells 170934  4.6     350000  9.4   350000  9.4
> Vcells 195920  1.5     786432  6.0   781384  6.0
>
> INCREASING THE R MEMORY FOLLOWING THE INSTRUCTIONS IN ?Memory
>
> I started R with:
>
> R --min-vsize=10M --max-vsize=4G --min-nsize=500k --max-nsize=900M
>> gc()
>          used (Mb) gc trigger (Mb) limit (Mb) max used (Mb)
> Ncells 130433  3.5     500000 13.4      25200   500000 13.4
> Vcells  81138  0.7    1310720 10.0         NA   499143  3.9
>
> It increased, but not by much!
>
> Please, please let me know.  I have read everything on r-help about this matter, but found no solution.  Thanks for your attention!
>
> Ricardo



-- 
Paul E. Johnson
Professor, Political Science
1541 Lilac Lane, Room 504
University of Kansas


