[R] Using R to compute some indices
Diego Moretti
dimorett at istat.it
Wed Feb 11 10:48:04 CET 2004
Hello,
Our statistics group is evaluating the use of R for computing some
indices.
We have some SAS datasets (about 120 MB) and we would like to evaluate
R's performance in computing the mean, percentiles, and the Gini index
of a population and of a survey sample.
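To give an idea of the kind of computations we need, something like the
following on a synthetic vector is what I have in mind (the gini()
function below is my own sketch of the usual formula, not taken from a
package):

x <- rlnorm(1000)                        # synthetic "income" values
mean(x)                                  # mean
quantile(x, probs = c(0.25, 0.5, 0.75))  # percentiles

# Gini index from the usual formula on the sorted values
gini <- function(x) {
  x <- sort(x)
  n <- length(x)
  sum((2 * (1:n) - n - 1) * x) / (n * sum(x))
}
gini(x)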
I need to open a dataset. So far I have understood that I have to
follow a code sequence like this:
alfa <- { a multiplier }                 # factor by which to raise the memory limit
memory.limit(alfa * round(memory.limit() / 1048576.0, 2))  # scale the current limit, expressed in MB
library(foreign)                         # read.xport() reads SAS XPORT files
hereis <- read.xport("C:/R/{ my exported SAS file }")
The size of { my exported SAS file } is 120 MB.
Is it correct to load the whole file into memory in a single variable (hereis)?
With alfa (the multiplier) set to 2, I get the following errors:
Error: cannot allocate vector of size 214 Kb
In addition: Warning message:
Reached total allocation of 446Mb: see help(memory.size)
How can I solve this problem?
Is R able to manage datasets of 100-150 MB, and under which conditions?
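Once a file does load, I plan to check how much memory the object
actually takes with standard calls, e.g.:

print(object.size(hereis))   # size of the loaded data frame, in bytes
gc()                         # run garbage collection and report memory use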
I'm looking for information about the use of R for these kinds of
specific problems. I'm also looking for example code of complex
programs in R.
The program I must build could be described by the following steps
(a rough sketch of what I have in mind follows the list):
1) open a 120 MB SAS dataset
2) merge it with a small dataset of universe weights
3) calculate a survey index
4) store the result in a file
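In outline, I imagine something along these lines (the file paths, the
join key "strata", and the variables "income" and "weight" are
placeholders, not the real names in our data):

library(foreign)

# 1) open the 120 MB SAS dataset (exported in XPORT format)
survey <- read.xport("C:/R/{ my exported SAS file }")

# 2) merge it with the small dataset of universe weights
weights <- read.table("C:/R/{ weights file }", header = TRUE)
merged  <- merge(survey, weights, by = "strata")

# 3) calculate a survey index, e.g. a weighted mean
index <- sum(merged$weight * merged$income) / sum(merged$weight)

# 4) store the result in a file
write.table(data.frame(index = index), "C:/R/{ output file }",
            row.names = FALSE)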
I'm also looking for some R-language developers/users to get in touch with.
Thank you for any advice.
Yours faithfully
Diego Moretti
--
============================================================
Diego Moretti (dimorett at istat.it)
Italian National Statistical Institute (ISTAT)