[R-sig-eco] memory size limits
Jens Åström
jens.astrom at ekol.slu.se
Tue Aug 17 15:46:44 CEST 2010
Hi,
Not sure it will help, but have a look at the gc() command as well.
If R has recently used a lot of RAM, calling gc() can help return some of it.
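For example, something like this (just a sketch; "big" stands in for whatever
large object you no longer need):

big <- matrix(rnorm(1e6), ncol = 100)  # some large temporary object
rm(big)                                # remove the reference first
gc()                                   # then ask R to release the freed memory
# the table returned by gc() summarises how much memory R is still using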
/Jens
On 08/17/2010 12:00 PM, r-sig-ecology-request at r-project.org wrote:
> Date: Mon, 16 Aug 2010 12:26:50 -0400
> From: "Howe, Eric (MNR)" <eric.howe at ontario.ca>
> To: <r-sig-ecology at r-project.org>
> Subject: [R-sig-eco] memory size limits
>
> Hello,
>
> When trying to fit spatially explicit capture-recapture models with a
> large number of parameters using the secr package, I get the following
> error message:
>
>> gH.yr.sH.jc.session <- secr.fit(seeh69,
>      model = list(g0 ~ h2 + yr, sigma ~ h2 + jc + session),
>      mask = masklist, CL = TRUE, detectfn = 1,
>      timecov = timecovs, sessioncov = sesscovs,
>      details = list(distribution = 'binomial'), stepmax = 10)
>
> Checking data
>
> Preparing detection design matrices
>
> Error: cannot allocate vector of size 150.6 Mb
>
> I only get the message when trying to fit models with many parameters.
>
> I am working on a PC with 2 GB of RAM, running Windows XP. I have
> already increased the memory limit in R, and the page file size in
> Windows, to the maximum allowed (4095 MB). I also defragmented my hard
> drive, which has plenty of free space available.
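
For reference, the Windows-only memory helpers look roughly like this (a
sketch only, using the 4095 Mb figure mentioned above):

memory.size()              # Mb of memory currently in use by R
memory.size(max = TRUE)    # maximum Mb obtained from Windows so far
memory.limit()             # current limit, in Mb
memory.limit(size = 4095)  # raise the limit to the maximum mentioned above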
>
> Can anyone suggest a way around this problem?
>
> Thanks,
>
> Eric