[R] Issue with length limit in write.table

alexia a.gaudeul at gmail.com
Wed Mar 3 17:50:56 CET 2010


Hi,

I have an issue with the write.table command:

I have a dataset with many rows and 3 columns. Here is an example row:

"alexia"	"roger","delphine"	"roger","bruno","sandra"

I first preprocess the data so that the column entries can be handled as vectors:

mo <- readLines("c:\\data.txt", n = -1)                  # one string per row
ms <- sapply(1:150, function(i) strsplit(mo[i], "\t"))   # split each of the 150 rows on tabs
texts1 <- unlist(lapply(1:150, function(i) ms[[i]][1]))  # first field of each row
texts2 <- unlist(lapply(1:150, function(i) ms[[i]][2]))  # second field
texts3 <- unlist(lapply(1:150, function(i) ms[[i]][3]))  # third field
texts <- cbind(texts1, texts2, texts3)
t <- matrix(texts, ncol = 3)                             # 150 x 3 character matrix
y <- matrix(lapply(parse(text = paste("c(", t, ")")), eval), ncol = ncol(t))
# each cell of y is now a character vector of names
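
As an aside, since strsplit() is already vectorised, the same preprocessing can
be written more compactly; this is only a sketch, assuming every line of
data.txt has exactly three tab-separated fields:

mo <- readLines("c:\\data.txt")
ms <- strsplit(mo, "\t")                 # list with one character vector per row
t  <- do.call(rbind, ms)                 # character matrix, one column per field
y  <- matrix(lapply(parse(text = paste("c(", t, ")")), eval), ncol = ncol(t))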

Either way, up to this point everything is fine. I then compare the vectors in
columns 2 and 3 row by row, to find the elements unique to each and the
elements they have in common:

z <- cbind(y, "A-B" = apply(y, 1, function(ab) setdiff(ab[[2]], ab[[3]])))     # in col 2, not in col 3
a <- cbind(z, "B-A" = apply(z, 1, function(ab) setdiff(ab[[3]], ab[[2]])))     # in col 3, not in col 2
b <- cbind(a, "AandB" = apply(a, 1, function(ab) intersect(ab[[3]], ab[[2]]))) # common to both
c <- lapply(b[, 4], length)  # number of elements in each "A-B" cell
d <- lapply(b[, 5], length)  # number of elements in each "B-A" cell
e <- lapply(b[, 6], length)  # number of elements in each "AandB" cell
f <- cbind(b, c, d, e)
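
For clarity, here is what the three added columns contain for the example row
shown above (a small standalone illustration, not part of the script itself):

A <- c("roger", "delphine")         # column 2 of the example row
B <- c("roger", "bruno", "sandra")  # column 3 of the example row
setdiff(A, B)    # "delphine"        -> the "A-B" column
setdiff(B, A)    # "bruno" "sandra"  -> the "B-A" column
intersect(A, B)  # "roger"           -> the "AandB" column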

Up to now, no problem.

Things only go wrong when I run the following command:

write.table(f, "c:\\dataprocessed.txt", sep = "\t")

At this point, the column entries are truncated after their 35th element. This
means that if a cell (say, column 2 of row 36) contains 50 names, the .txt
file only contains the first 35.

Is there a way to solve this?
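
One workaround might be to collapse every cell of f into a single
comma-separated string before writing, so that write.table() only ever sees
plain character data. A rough, untested sketch (the output file name is just
an example):

f_flat <- apply(f, c(1, 2), function(cell) paste(unlist(cell), collapse = ","))
write.table(f_flat, "c:\\dataflat.txt", sep = "\t")

This would write each cell as one quoted field, e.g. "roger,delphine", rather
than one pair of quotes per name as in the input, but I have not verified that
it avoids the truncation.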

Thanks,

Alexia