[R] Memory limit problem

Chris Howden chris at trickysolutions.com.au
Tue Oct 12 07:55:53 CEST 2010


Hi Daniel,

There are a number of ways to deal with data without forcing them into
RAM.

If you're comfortable with SQL, the easiest way might be to use sqldf to
join them with a SQL SELECT query (sketched below). Try googling
"Handling large(r) datasets in R" by Soren Hojsgaard.

Or, if you definitely only want to do a cbind and not a merge, you might
be able to use one of the following packages. These store the data on
disk (rather than in RAM) and might allow you to cbind them; see the
sketch after the list.
	filehash
	ff
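
Here is a rough sketch with ff (again only an illustration with made-up
vectors, not tested on your bathymetry objects): keep each column on
disk as an ff vector and combine them into an ffdf instead of
cbind()'ing everything in RAM.

    library(ff)                   # install.packages("ff") if needed
    x <- ff(rnorm(1e6))           # stored in a temp file on disk, not in RAM
    z <- ff(runif(1e6))
    xyz <- ffdf(x = x, z = z)     # on-disk data frame holding both columns
    dim(xyz)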




Chris Howden
Founding Partner
Tricky Solutions
Tricky Solutions 4 Tricky Problems
Evidence Based Strategic Development, IP development, Data Analysis,
Modelling, and Training
(mobile) 0410 689 945
(fax / office) (+618) 8952 7878
chris at trickysolutions.com.au

-----Original Message-----
From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org]
On Behalf Of Daniel Nordlund
Sent: Tuesday, 12 October 2010 3:00 PM
To: r-help at r-project.org
Subject: Re: [R] Memory limit problem

> -----Original Message-----
> From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org]
> On Behalf Of David Winsemius
> Sent: Monday, October 11, 2010 10:07 PM
> To: Tim Clark
> Cc: r help r-help
> Subject: Re: [R] Memory limit problem
>
>
> On Oct 11, 2010, at 11:49 PM, Tim Clark wrote:
>
> > Dear List,
> >
> > I am trying to plot bathymetry contours around the Hawaiian Islands
> > using the
> > package rgdal and PBSmapping.  I have run into a memory limit when
> > trying to
> > combine two fairly small objects using cbind().  I have increased
> > the memory to
> > 4GB, but am being told I can't allocate a vector of size 240 Kb.  I
> > am running R
> > 2.11.1 on a Dell Optiplex 760 with Windows XP.  I have pasted the
> > error message
> > and summaries of the objects below.  Thanks for your help.  Tim
> >
> >
> >>      xyz<-cbind(hi.to.utm,z=b.depth$z)
> > Error: cannot allocate vector of size 240 Kb
>
> You have too much other "stuff".
> Try this:
>
> getsizes <- function() {
>     z <- sapply(ls(envir = globalenv()),
>                 function(x) object.size(get(x)))
>     (tmp <- as.matrix(rev(sort(z))[1:10]))
> }
> getsizes()
>
> You will see a list of the largest objects in descending order. Then
> use rm() to clear out unneeded items.
>
> --
> David,
>
> >
> >> memory.limit()
> > [1] 4000
>
> Seems unlikely that you really have that much space in that 32 bit OS.
<<<snip>>>

Yeah, without performing some special incantations, Windows XP will not
allocate more than 2GB of memory to any one process (e.g. R).  And even
with those special incantations, you will get no more than about
3.2-3.5 GB.  The other thing to remember is that even if you had more
than enough free space, R requires the free space for an object to be
contiguous.  So if memory were fragmented and you didn't have 240KB of
contiguous memory, it still couldn't allocate the vector.
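
A couple of quick checks will show how much headroom you actually have
(these are the standard Windows-only helpers; big.object below is just a
placeholder for whatever large object you no longer need):

    memory.limit()           # ceiling R will try to allocate, in MB
    memory.size()            # memory the current R session is using, in MB
    gc()                     # run garbage collection and report usage
    rm(big.object); gc()     # drop a large object, then reclaim its space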

Hope this is helpful,

Dan

Daniel Nordlund
Bothell, WA USA

