[R] Memory allocation in 64 bit R
Uwe Ligges
ligges at statistik.tu-dortmund.de
Sat Oct 2 15:28:08 CEST 2010
On 02.10.2010 03:10, Peter Langfelder wrote:
> Hi Mete,
>
> I think you should look at the help for memory.limit. Try to set a
> higher one, for example
>
> memory.limit(16000)
>
> (I think 16GB is what xenon will take).
But that is of little use given that you have only 8 Gb of RAM in your machine.
So the answer probably is: buy more RAM or try to reduce the size of the problem.
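For reference, here is a minimal sketch of the calls involved (Windows-only;
memory.limit() and memory.size() do not apply on other platforms, and the
16000 Mb figure is just Peter's example value):

  memory.limit()        # current limit in Mb
  memory.size()         # Mb currently used by this R session
  memory.limit(16000)   # request a ~16 Gb limit; needs RAM plus page file to back it
  gc()                  # release unused objects and report current memory usage

Raising the limit beyond the physical RAM only pushes the work into the page
file, which is usually far too slow to be useful.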
> Peter
>
> On Fri, Oct 1, 2010 at 6:02 PM, Mete Civelek<mcivelek at mednet.ucla.edu> wrote:
>> Hi Everyone,
>>
>> I am getting the following error message
>>
>> Error: cannot allocate vector of size 2.6 Gb
So just this single step requires allocating another 2.6 Gb! Note that, from
the information you give below, you had only about 2.6 Gb free in total (the
8122 Mb limit minus the roughly 5.4 Gb already in use). Hence it will not work
on any OS if you limit R to 8 Gb; see the rough size estimate below.
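A rough back-of-the-envelope check may help here (the column count below is
only inferred from the 2.6 Gb figure, it is not stated anywhere in your
message): bicor() returns a dense numeric matrix with one row and one column
per column of datExpr, and each step of ((1 + ...) / 2)^8 allocates a further
matrix of the same size, so at least two of these 2.6 Gb matrices must exist
at the same time, on top of the ~5.4 Gb your session is already using.

  n <- ncol(datExpr)   # number of variables handed to bicor()
  n^2 * 8 / 2^30       # Gb needed for ONE dense n x n matrix of doubles
  ## 2.6 Gb corresponds to n of roughly 18700 columns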
Uwe Ligges
>> In addition: Warning messages:
>> 1: In dim(res$res) = dim(bi) :
>> Reached total allocation of 8122Mb: see help(memory.size)
>> 2: In dim(res$res) = dim(bi) :
>> Reached total allocation of 8122Mb: see help(memory.size)
>> 3: In dim(res$res) = dim(bi) :
>> Reached total allocation of 8122Mb: see help(memory.size)
>> 4: In dim(res$res) = dim(bi) :
>> Reached total allocation of 8122Mb: see help(memory.size)
>>
>> Here is the relevant info
>>
>>> sessionInfo()
>> R version 2.11.1 (2010-05-31)
>> x86_64-pc-mingw32
>>
>> locale:
>> [1] LC_COLLATE=English_United States.1252
>> [2] LC_CTYPE=English_United States.1252
>> [3] LC_MONETARY=English_United States.1252
>> [4] LC_NUMERIC=C
>> [5] LC_TIME=English_United States.1252
>>
>> attached base packages:
>> [1] splines tcltk stats graphics grDevices utils datasets
>> [8] methods base
>>
>> other attached packages:
>> [1] cluster_1.12.3 WGCNA_0.93 Hmisc_3.8-2
>> [4] survival_2.35-8 qvalue_1.22.0 flashClust_1.00-2
>> [7] dynamicTreeCut_1.21 impute_1.22.0
>>
>> loaded via a namespace (and not attached):
>> [1] grid_2.11.1 lattice_0.19-11 tools_2.11.1
>>
>>> memory.size(NA)
>> [1] 8122.89
>>> memory.size()
>> [1] 5443.18
>>> memory.limit()
>> [1] 8122
>>> .Machine$sizeof.pointer
>> [1] 8
>>
>> And this is what I am trying to do when I get this error message
>>> ls()
>> [1] "datExpr"
>>> print(object.size(datExpr), units = "auto")
>> 23.5 Mb
>>> ADJ1=((1+bicor(datExpr, use="pairwise.complete.obs", maxPOutliers=0.05, quick=0, pearsonFallback="individual"))/2)^8
>>
>> If I understand the archives correctly, my problem is with allocating a large vector within the address space. Is there any way to get around this without having to use a Linux system? Has anyone been able to solve this problem?
>>
>> I appreciate any suggestions or help.
>>
>> Mete Civelek
>>
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.