[R] read.table

John Maindonald john.maindonald at anu.edu.au
Sat Feb 26 21:45:14 CET 2005


In addition to the other suggestions made, note also count.fields().

 > cat("10 9 17  # First of 7 lines", "11 13 1 6", "9 14 16",
+     "12 15 14", "8 15 15", "9 13 12", "7 14 18",
+     file="oneBadRow.txt", sep="\n")
 > nfields <- count.fields("oneBadRow.txt")
 > nfields
[1] 3 4 3 3 3 3 3
 > table(nfields)     ## Use with many records
nfields
3 4
6 1
 > tab <- table(nfields)
 > (1:length(nfields))[nfields == 4]
[1] 2
 > readLines("oneBadRow.txt", n=-1)[2]
[1] "11 13 1 6"
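The same diagnosis can be written a little more directly with which(), and fill = TRUE then lets read.table() accept the file anyway, padding short rows with NA. A sketch (recreating the example file so the snippet stands alone):

```r
## Recreate the example file, then locate the bad row with which().
cat("10 9 17  # First of 7 lines", "11 13 1 6", "9 14 16",
    "12 15 14", "8 15 15", "9 13 12", "7 14 18",
    file = "oneBadRow.txt", sep = "\n")
nfields <- count.fields("oneBadRow.txt")
bad <- which(nfields != 3)          # rows whose field count is not 3
readLines("oneBadRow.txt")[bad]     # "11 13 1 6"
## fill = TRUE pads the short rows with NA instead of stopping:
d <- read.table("oneBadRow.txt", fill = TRUE)
dim(d)                              # 7 rows, 4 columns
```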

Note the various option settings for count.fields() (sep, quote, skip, comment.char, blank.lines.skip); they should mirror the arguments passed to read.table().
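As a made-up illustration of why those options matter for a file like Sean's: read.table()'s default quote = "\"'" treats a single quote at the start of a field as opening a string, which then swallows separators (and possibly later lines); restricting quote to the double quote only, or turning quoting off with quote = "", recovers the true field counts.

```r
## An invented tab-delimited file in which one field begins with a
## stray single quote:
cat("a\tb\tc", "1\t'oops\t3", "4\t5\t6",
    file = "strayQuote.txt", sep = "\n")
## Default quote = "\"'": the ' opens an unterminated string, so the
## counts come out wrong (scan may also warn about an EOF in a string):
suppressWarnings(count.fields("strayQuote.txt", sep = "\t"))
## With single-quote handling off, the true counts are 3 3 3:
count.fields("strayQuote.txt", sep = "\t", quote = "\"")
d <- read.table("strayQuote.txt", sep = "\t", header = TRUE, quote = "\"")
d$b      # the stray quote is kept as literal text
```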

John Maindonald             email: john.maindonald at anu.edu.au
phone : +61 2 (6125)3473    fax  : +61 2 (6125)5549
Centre for Bioinformation Science, Room 1194,
John Dedman Mathematical Sciences Building (Building 27)
Australian National University, Canberra ACT 0200.


On 26 Feb 2005, at 10:03 PM, r-help-request at stat.math.ethz.ch wrote:

> From: Sean Davis <sdavis2 at mail.nih.gov>
> Date: 26 February 2005 7:11:48 AM
> To: r-help <r-help at stat.math.ethz.ch>
> Subject: [R] read.table
>
>
> I have a commonly recurring problem and wondered if folks would share 
> tips.  I routinely get tab-delimited text files that I need to read 
> in.  In very many cases, I get:
>
> > a <- read.table('junk.txt.txt',header=T,skip=10,sep="\t")
> Error in scan(file = file, what = what, sep = sep, quote = quote, dec = dec,  :
> 	line 67 did not have 88 elements
>
> I am typically able to go through the file and find a single quote or 
> something like that causing the problem, but with a recent set of 
> files, I haven't been able to find such an issue.  What can I do to 
> get around this problem?  I can use perl, also....
>
> Thanks,
> Sean
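Putting the pieces together, a hypothetical reconstruction of Sean's situation (file name, header count, and column counts all invented here): mirror the read.table() arguments in count.fields(), take the modal field count as the expected one, and list every row that deviates, remembering to add skip back on when indexing readLines().

```r
## Invented stand-in for Sean's file: 10 header lines, tab-separated
## data, and one row with a surplus field.
writeLines(c(rep("header line", 10),
             "1\t2\t3",
             "1\t2\t3\t4",        # the row scan() would choke on
             "4\t5\t6"),
           "junkLike.txt")
nf    <- count.fields("junkLike.txt", sep = "\t", skip = 10)
modal <- as.integer(names(which.max(table(nf))))   # the usual count, 3
bad   <- which(nf != modal)                        # 2
readLines("junkLike.txt")[bad + 10]                # "1\t2\t3\t4"
```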



