[R] memory allocation problem
Lorenzo Cattarino
l.cattarino at uq.edu.au
Wed Nov 3 05:22:11 CET 2010
Thanks for all your suggestions,
This is what I get after removing all the other (not useful) objects and
running my code:
> getsizes()
                [,1]
org_results 47240832
myfun          11672
getsizes        4176
SS              3248
coeff            168
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA
<NA>              NA
> est_coeff <- optim(coeff,SS, steps=org_results$no.steps,
Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
p=org_results$p)
Error: cannot allocate vector of size 5.0 Mb
In addition: Warning messages:
1: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range, :
Reached total allocation of 4055Mb: see help(memory.size)
2: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range, :
Reached total allocation of 4055Mb: see help(memory.size)
3: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range, :
Reached total allocation of 4055Mb: see help(memory.size)
4: In optim(coeff, SS, steps = org_results$no.steps, Range =
org_results$Range, :
Reached total allocation of 4055Mb: see help(memory.size)
>
It seems that R is using all of the default available memory (4 GB,
which is the amount of RAM on my machine).
> memory.limit()
[1] 4055
> memory.size()
[1] 4049.07
>
My dataframe has a size of 47240832 bytes, or about 45 Mb, so the data
themselves should not be a problem in terms of memory usage.
I do not understand what is going on.
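One thing I did notice: a double vector of length 656100 (one column of
my dataframe) takes 656100 x 8 bytes, which is almost exactly the 5.0 Mb
that R says it cannot allocate. So the failing request is just one
column-sized temporary; the real question is what fills the session up
to the 4055 Mb cap in the first place. As a purely hypothetical sketch
(this is not my real SS, just something with the same memory pattern),
each vectorized step in an objective like this allocates a fresh
656100-element (~5 Mb) double vector, and optim() evaluates it many
times:

SS <- function(coeff, steps, Range, H1, H2, p) {
  # each line below creates at least one new 656100-element double vector
  predicted <- coeff[1] + coeff[2] * Range * exp(-coeff[3] * steps)
  residuals <- p - predicted
  sum(residuals^2)
}

Once the session is near the cap, the first ~5 Mb temporary that does
not fit triggers the error, even though each one is small on its own.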
Thanks for your help anyway
Lorenzo
-----Original Message-----
From: David Winsemius [mailto:dwinsemius at comcast.net]
Sent: Wednesday, 3 November 2010 12:48 PM
To: Lorenzo Cattarino
Cc: r-help at r-project.org
Subject: Re: [R] memory allocation problem
Restart your computer. (Yeah, I know, that's what the help-desk always
says.)
Start R before doing anything else.
Then run your code in a clean session. Check ls() after start-up to
make sure you don't have a bunch of useless stuff in your .RData
file. Don't load anything that is not germane to this problem. Use
this function to see what sort of space issues you might have after
loading objects:
getsizes <- function() {
  # sizes of all objects in the global environment, largest ten first
  z <- sapply(ls(envir = globalenv()), function(x) object.size(get(x)))
  (tmp <- as.matrix(rev(sort(z))[1:10]))
}
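If getsizes() turns up leftovers you don't need, something like this
(a minimal sketch; the object names are only illustrative) will clear
them before the big run:

# keep only the objects the optimization actually needs
rm(list = setdiff(ls(), c("org_results", "coeff", "SS", "myfun", "getsizes")))
gc()  # collect garbage and report how much memory R is still holding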
Then run your code.
--
David.
On Nov 2, 2010, at 10:13 PM, Lorenzo Cattarino wrote:
> I would also like to include details on my R version
>
>
>
>> version
>                _
> platform       x86_64-pc-mingw32
> arch           x86_64
> os             mingw32
> system         x86_64, mingw32
> status
> major          2
> minor          11.1
> year           2010
> month          05
> day            31
> svn rev        52157
> language       R
> version.string R version 2.11.1 (2010-05-31)
>
> from FAQ 2.9
> (http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021)
> it says that:
> "For a 64-bit build, the default is the amount of RAM"
>
> So in my case the amount of RAM would be 4 GB. R should be able to
> allocate a vector of size 5 Mb without me typing any command (either
> memory.limit() inside R or a string appended to the shortcut's target
> path), is that right?
>
>
>
> From: Lorenzo Cattarino
> Sent: Wednesday, 3 November 2010 10:55 AM
> To: 'r-help at r-project.org'
> Subject: memory allocation problem
>
>
>
> I forgot to mention that I am using Windows 7 (64-bit) and R version
> 2.11.1 (64-bit).
>
>
>
> From: Lorenzo Cattarino
>
> I am trying to run a non-linear parameter optimization using the
> function optim() and I am having problems with memory allocation.
>
> My data are in a dataframe with 9 columns. There are 656100 rows.
>
>> head(org_results)
>
>   comb.id   p H1 H2 Range Rep no.steps      dist aver.hab.amount
> 1       1 0.1  0  0     1 100        0 0.2528321       0.1393901
> 2       1 0.1  0  0     1 100        0 0.4605934       0.1011841
> 3       1 0.1  0  0     1 100        4 3.4273670       0.1052789
> 4       1 0.1  0  0     1 100        4 2.8766364       0.1022138
> 5       1 0.1  0  0     1 100        0 0.3496872       0.1041056
> 6       1 0.1  0  0     1 100        0 0.1050840       0.3572036
>
>> est_coeff <- optim(coeff,SS, steps=org_results$no.steps,
> Range=org_results$Range, H1=org_results$H1, H2=org_results$H2,
> p=org_results$p)
>
> Error: cannot allocate vector of size 5.0 Mb
>
> In addition: Warning messages:
>
> 1: In optim(coeff, SS, steps = org_results$no.steps, Range =
> org_results$Range, : Reached total allocation of 10000Mb: see
> help(memory.size)
>
> 2: In optim(coeff, SS, steps = org_results$no.steps, Range =
> org_results$Range, : Reached total allocation of 10000Mb: see
> help(memory.size)
>
> 3: In optim(coeff, SS, steps = org_results$no.steps, Range =
> org_results$Range, : Reached total allocation of 10000Mb: see
> help(memory.size)
>
> 4: In optim(coeff, SS, steps = org_results$no.steps, Range =
> org_results$Range, : Reached total allocation of 10000Mb: see
> help(memory.size)
>
>> memory.size()
>
> [1] 9978.19
>
>> memory.limit()
>
> [1] 10000
>
>>
>
> I know that I am not sending reproducible code, but I was hoping you
> could help me understand what is going on. I set a maximum limit of
> 10000 megabytes (by appending --max-mem-size=10000M to the target
> path: right-click on the R icon, Shortcut tab). And R is telling me
> that it cannot allocate a vector of size 5 Mb???
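>
> For reference, a minimal sketch of the in-session equivalents of that
> shortcut flag (these are Windows-only functions):
>
> memory.limit()              # report the current cap in Mb
> memory.limit(size = 10000)  # raise the cap to ~10 Gb, if the OS allows it
> memory.size(max = TRUE)     # peak Mb this session has obtained from the OS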
>
>
David Winsemius, MD
West Hartford, CT