[R] Alternatives to merge for large data sets?
Prof Brian Ripley
ripley at stats.ox.ac.uk
Thu Sep 7 10:57:58 CEST 2006
Which version of R?
Please try 2.4.0 alpha, as it has a different and more efficient
algorithm for the case of 1-1 matches.
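For the 1-1 case, a match()-based lookup is the usual memory-lean
substitute for merge(): it computes the row positions once and copies
only the columns that are kept. A minimal sketch (illustrative, not
from the thread; it reproduces an all.x = TRUE join, so unmatched rows
of prof would have to be appended separately to get all = TRUE):

  idx <- match(pubbounds$user, prof$userid)   # NA where no match
  pubbounds.prof <- cbind(pubbounds,
                          prof[idx, setdiff(names(prof), "userid"),
                               drop = FALSE])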
On Wed, 6 Sep 2006, Adam D. I. Kramer wrote:
> Hello,
>
> I am trying to merge two very large data sets, via
>
> pubbounds.prof <-
> merge(x=pubbounds,y=prof,by.x="user",by.y="userid",all=TRUE,sort=FALSE)
>
> which gives me an error of
>
> Error: cannot allocate vector of size 2962 Kb
>
> I am reasonably sure that this is correct syntax.
>
> The trouble is that pubbounds and prof are large; they are data frames which
> take up 70M and 11M respectively when saved as .Rdata files.
>
> I understand from various archive searches that "merge can't handle that,"
> because merge takes n^2 memory, which I do not have.
Not really true (merge has been changed since those days). Of course,
if you have multiple matches it must use more memory.
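Whether the 1-1 case applies is easy to check: duplicated keys on
either side force multiple matches, and hence a larger result. For
example (illustrative):

  any(duplicated(prof$userid))     # TRUE means some userid repeats
  any(duplicated(pubbounds$user))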
> My question is whether there is an alternative to merge which would carry
> out the process in a slower, iterative manner... or if I should just bite
> the bullet, write.table, and use a perl script to do the job.
>
> Thankful as always,
> Adam D. I. Kramer
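On the iterative alternative asked about above: merge() can be applied
chunk by chunk, writing each piece to disk so that no intermediate
result has to be held in memory at once. A rough sketch, assuming a
left (all.x = TRUE) join per chunk is acceptable; merge.chunked and
the output file name are invented for illustration:

  merge.chunked <- function(x, y, by.x, by.y, file, chunk = 50000L) {
    starts <- seq(1L, nrow(x), by = chunk)
    for (s in starts) {
      rows <- s:min(s + chunk - 1L, nrow(x))
      m <- merge(x[rows, , drop = FALSE], y,
                 by.x = by.x, by.y = by.y, all.x = TRUE, sort = FALSE)
      write.table(m, file = file, append = (s > 1L),
                  col.names = (s == 1L), row.names = FALSE,
                  sep = "\t", quote = FALSE)
    }
  }

  merge.chunked(pubbounds, prof, "user", "userid", "pubbounds_prof.tsv")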
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595