[R] bigmemory package woes

zerdna azege at yahoo.com
Fri Apr 23 23:51:32 CEST 2010


I have pretty big data sizes (matrices of 0.5 to 1.5 GB), so once I need to
juggle several of them I need a disk-backed cache. I am trying to use the
bigmemory package but am running into problems that are hard to understand:
segmentation faults and the machine just hanging. I work, by the way, on Red
Hat Linux, 64-bit, R version 10.

The simplest problem is just saving matrices. When I do something like

r <- matrix(rnorm(100), nr=10); library(bigmemory)
for(i in 1:3) xx <- as.big.matrix(r, backingfile=paste("r", i, sep="",
collapse=""), backingpath=MyDirName)

it works just fine -- it saves the small matrix to disk as three separate
backing files. However, when I try it at real size, with

r <- matrix(rnorm(50000000), nr=1000)

I either get a segfault while saving the third big matrix, or the machine
hangs forever.
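
For reference, the full-scale loop looks roughly like this (MyDirName is just
a placeholder for my actual output directory; each backing file should come
out around 400 MB, i.e. 50,000,000 doubles at 8 bytes each):

library(bigmemory)

MyDirName <- "/tmp/bigmem"               # placeholder for the real directory
r <- matrix(rnorm(50000000), nr=1000)    # 1000 x 50000 matrix, ~400 MB

for(i in 1:3) {
    # each iteration writes a separate file-backed copy: r1, r2, r3
    xx <- as.big.matrix(r, backingfile=paste("r", i, sep=""),
                        backingpath=MyDirName)
}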

Am I doing something obviously wrong, or is the package unstable at the
moment? Could anyone recommend something similar that is reliable for this
case?


