[R] Problem reading large tables
Duncan Murdoch
dmurdoch at pair.com
Tue Jan 6 20:43:22 CET 2004
On Tue, 6 Jan 2004 14:03:47 -0500, Daniel Sumers Myers
<dmyers at umiacs.umd.edu> wrote:
>Hi,
> I'm trying to read in a fairly large (92 observations by 3680 variables)
>table into R from a space-delimited text file (attached) using the command:
>d8 <- read.table('d8.r', header=T). The function call runs to completion, and I
>get back a valid table object. However, starting at column 999, the table
>records the value TRUE when it should record T (T's in columns 998 and earlier
>are fine). I've looked at the data file, and I can see no difference between
>(e.g.) the T at position 998 in row 1 and the T in position 999 in row 1, yet
>998 is recorded as T and 999 as TRUE.
The special-looking value 999 is probably just a coincidence. What most
likely happened is that column 999 was the first column that looked to
the type.convert function like a purely logical column (because every
value in it is T?), so it was stored as logical and prints as TRUE. You
can tell R not to do the automatic conversion by using the colClasses
argument to read.table; e.g. colClasses = "character" forces every
column to be read as character.
Duncan Murdoch
P.S. You can't send attachments to the mailing list, so I didn't see
your data file.