[R] problem
David Winsemius
dwinsemius at comcast.net
Thu Mar 6 06:00:17 CET 2008
Philipp Pagel <p.pagel at wzw.tum.de> wrote in
news:20080305120637.GA8181 at localhost:
> On Wed, Mar 05, 2008 at 12:32:19PM +0100, Erika Frigo wrote:
>> My file has not only more than a million values, but more than a
>> million rows and roughly 30 columns (it is a production dataset
>> for cows); in fact, with read.table I'm not able to import it.
>> It is an xls file.
There is something very wrong here. Even the most recent version of
Excel cannot handle worksheets with much more than a million rows, and
earlier versions could not handle even one-tenth that number: their
limit was 65,536 rows.
--
David Winsemius
>
> read.table() expects plain text -- e.g. CSV, or tab-separated in the
> case of read.delim(). If your file is in xls format, the simplest
> option would be to export the data to CSV format from Excel.
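
For what it's worth, once the data have been exported from Excel as,
say, cows.csv, something along these lines should read it. The file
name and colClasses are only guesses to adapt to the real layout;
supplying colClasses and a rough nrows estimate helps keep memory use
down on a file this size:

  ## sketch only: assumes one character ID column plus 29 numeric columns
  dat <- read.csv("cows.csv", header = TRUE,
                  colClasses = c("character", rep("numeric", 29)),
                  nrows = 1.2e6)
  str(dat)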
>
> If for some reason that is not an option please have a look at the
> "R Data Import/Export" manual.
>
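If exporting from Excel is awkward, that manual also covers reading xls
files directly. One route is read.xls() from the gdata package; it
needs a working Perl installation, and the file and sheet here are only
placeholders:

  library(gdata)
  dat <- read.xls("cows.xls", sheet = 1)
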
> Of course, neither will solve the problem of not enough memory if
> your file is simply too large. In that case you may want to put
> your data into a database and have R connect to it and retrieve the
> data in smaller chunks as required.
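
As a sketch of that approach with DBI and RSQLite (the database, table
and chunk size are made up -- it assumes the data have already been
loaded into a "cows" table in cows.db):

  library(RSQLite)                      # loads DBI as well
  con <- dbConnect(SQLite(), dbname = "cows.db")
  res <- dbSendQuery(con, "SELECT * FROM cows")
  while (!dbHasCompleted(res)) {
      chunk <- fetch(res, n = 100000)   # work on 100,000 rows at a time
      ## ... process 'chunk' here ...
  }
  dbClearResult(res)
  dbDisconnect(con)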
>
> cu
> Philipp
>