[R] Handling large dataset & dataframe
Roger Koenker
rkoenker at uiuc.edu
Mon Apr 24 19:51:14 CEST 2006
You can read it in chunks and store it in sparse matrix form using the
SparseM or Matrix packages, but then you need to think about what you
want to do with it.... least-squares sorts of things are OK, but other
options are somewhat limited...
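A dense double-precision matrix of 350,000 rows by 266 columns takes
about 350000 * 266 * 8 bytes, roughly 745 MB, so it won't fit
comfortably in 1 GB of RAM; sparse storage of the 250 dummy columns is
what makes the problem tractable. Here is a minimal sketch of the
chunked-read idea using the Matrix package; the file name
"bigdata.csv", the comma delimiter, and the chunk size are assumptions
to adapt:

    library(Matrix)

    chunk.size <- 50000
    con <- file("bigdata.csv", open = "r")   # hypothetical file name
    hdr <- readLines(con, n = 1)             # discard the header row
    blocks <- list()
    repeat {
        chunk <- tryCatch(
            read.table(con, sep = ",", nrows = chunk.size,
                       colClasses = "numeric"),
            error = function(e) NULL)  # NULL once the file is exhausted
        if (is.null(chunk)) break
        ## the dummy columns are mostly zeros, so each sparse block
        ## takes far less memory than its dense equivalent
        blocks[[length(blocks) + 1]] <-
            Matrix(as.matrix(chunk), sparse = TRUE)
    }
    close(con)
    X <- do.call(rbind, blocks)   # one sparse 350000 x 266 matrix

For the least-squares sorts of things, crossprod(X) stays sparse and
solve() uses a sparse Cholesky factorization; y below is a hypothetical
numeric response of length nrow(X). (SparseM provides slm() for sparse
least squares along the same lines.)

    betahat <- solve(crossprod(X), crossprod(X, y))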
Roger Koenker
Department of Economics
University of Illinois
Champaign, IL 61820
url: www.econ.uiuc.edu/~roger
email: rkoenker at uiuc.edu
vox: 217-333-4558
fax: 217-244-6678
On Apr 24, 2006, at 12:41 PM, Sachin J wrote:
> Hi,
>
> I have a dataset consisting of 350,000 rows and 266 columns; 250 of
> those 266 columns are dummy variables. I am trying to read this
> dataset into an R data frame, but I am unable to do so because of
> memory limitations (the object created is too large for R to handle).
> Is there a way to handle such a large dataset in R?
>
> My PC has 1 GB of RAM and 55 GB of hard disk space, and runs Windows XP.
>
> Any pointers would be of great help.
>
> TIA
> Sachin
>