[R] read.table & readLines behaviour?

J.delasHeras at ed.ac.uk
Wed Sep 24 12:24:05 CEST 2008

Quoting Peter Dalgaard <P.Dalgaard at biostat.ku.dk>:

> J.delasHeras at ed.ac.uk wrote:
>> Hi,
>> I have been using 'read.table' regularly to read tab-delimited text
>> files with data. No problem, until now.
>> Now I have a file that appeared to have read fine, and the data inside
>> looks correct (structure etc), except I only had 15000+ rows out of
>> the expected 24000. Using 'readLines' instead, and breaking up the
>> data by tabs, gives me the expected result.
>> I do not understand why this is happening and I can't find anything
>> obvious in the data to explain the behaviour...
>> Does anybody have an explanation? Something to watch out for?
> Hmm:
> - completely blank lines
> - filling
> - quotes
> My bet would be on the last one. Does read.delim work better?

I just tried 'read.delim', and it reads the file just fine:

> xxx<-read.delim("All_norm_calls.txt", header=T, sep="\t")
> dim(xxx)
[1] 24000    11

I'll check for quotes etc. Thanks!
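[A likely mechanism, for the archive: 'read.table' defaults to quote = "\"'", so it treats single quotes as quoting characters; an unmatched apostrophe in a text column (e.g. a 5' annotation) makes it swallow everything up to the next quote, silently merging rows. 'read.delim' defaults to quote = "\"" and comment.char = "", which avoids both that and stray '#' characters. A sketch, assuming the hypothetical filename from above:]

```r
# Diagnose: count fields per line under read.table's default quoting.
# A mix of field counts (not all 11) suggests quote/comment trouble.
table(count.fields("All_norm_calls.txt", sep = "\t", quote = "\"'"))

# Workaround: disable single-quote handling (or all quoting with quote = "")
xxx <- read.table("All_norm_calls.txt", header = TRUE, sep = "\t",
                  quote = "\"", comment.char = "")
dim(xxx)   # should now give the full row count
```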

> Also, just in case: Check length(probesets) after the readLines call.

I did, the first time. It gives me the expected 20001 lines (the first
one is the header).


Dr. Jose I. de las Heras                      Email: J.delasHeras at ed.ac.uk
The Wellcome Trust Centre for Cell Biology    Phone: +44 (0)131 6513374
Institute for Cell & Molecular Biology        Fax:   +44 (0)131 6507360
Swann Building, Mayfield Road
University of Edinburgh
Edinburgh EH9 3JR

The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.

More information about the R-help mailing list