[R] problems in read.table

tkobayas at indiana.edu
Thu Sep 6 20:29:53 CEST 2007


Dear R-users,

I have encountered the following problem every now and then. Before, I
was dealing with very small datasets, so it wasn't a problem (I just
edited the dataset in an OpenOffice spreadsheet). This time I have to
deal with many large datasets containing commuting flow data. I would
appreciate it if anyone could give me a hint or clue to get out of this
problem.

I have a .dat file called "1081.dat": 1001 means Birmingham, AL.

I imported this .dat file using read.table like this:
tmp<-read.table('CTPP3_ANSI/MPO3441_ctpp3_sumlv944.dat',header=T)

Then I got this error message:
Error in scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings,  :
        line 9499 did not have 209 elements

Since I got error messages saying that other rows did not have 209
elements either, I added skip=c(205,9499,9294), hoping that R would
take care of the problem. But I got a similar error message:
tmp<-read.table('CTPP3_ANSI/MPO3441_ctpp3_sumlv944.dat',header=T,skip=c(205,9499,9294))
Error in scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings,  :
        line 9294 did not have 209 elements
In addition: Warning message:
the condition has length > 1 and only the first element will be used 
in: if (skip > 0) readLines(file, skip)
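
(The warning seems to say that skip only uses a single number, i.e. the
number of lines to skip at the top of the file, rather than a vector of
line numbers. As a rough, untested diagnostic, I assume something like
count.fields() could at least show which lines are malformed:)

nf <- count.fields('CTPP3_ANSI/MPO3441_ctpp3_sumlv944.dat',
                   blank.lines.skip = FALSE)
table(nf)          # distribution of field counts per line
which(nf != 209)   # lines (header included) without 209 fields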

Is there any way to get R to automatically skip problematic rows,
perhaps something along the lines of the sketch below? Thank you very
much!
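
(A rough, untested idea, assuming whitespace-separated fields as in the
read.table() call above and that the header line has the same number of
fields as a good data row: drop the malformed lines before parsing.)

f    <- 'CTPP3_ANSI/MPO3441_ctpp3_sumlv944.dat'
nf   <- count.fields(f, blank.lines.skip = FALSE)  # fields per line
txt  <- readLines(f)
keep <- nf == nf[1]           # keep only lines as wide as the header
tmp  <- read.table(textConnection(txt[keep]), header = TRUE)
## fill = TRUE is another option (it pads short rows with NA),
## but it keeps the malformed rows instead of skipping them:
## tmp <- read.table(f, header = TRUE, fill = TRUE)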

Taka
