[R] importing files, columns "invade next column"
Marc Schwartz
MSchwartz at MedAnalytics.com
Wed Jan 19 06:42:23 CET 2005
On Wed, 2005-01-19 at 04:25 +0000, Tiago R Magalhaes wrote:
> Dear R-listers:
>
> I want to import a reasonably big file (15797 rows x 257 columns)
> into a table. The file is tab delimited with NA in every empty space.
Tiago,
Have you tried using read.table() with the field delimiter explicitly
defined as a tab, to see if that changes anything?
Try the following:
AllFBImpFields <- read.table('AllFBAllFieldsNAShorter.txt',
                             header = TRUE,
                             row.names = paste('a', 1:15797, sep = ''),
                             as.is = TRUE,
                             sep = "\t")
I added the 'sep = "\t"' argument at the end.
Also, leave out 'fill = TRUE', which can cause problems. You do not need
it unless your source file has a varying number of fields per line.
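For what it's worth, here is a tiny sketch of what 'fill = TRUE' actually
does, using a made-up three-line table fed in through textConnection()
rather than your file: the short second row is quietly padded with NA,
which can hide exactly the kind of alignment problem you are seeing.

con <- textConnection("a\tb\tc\n1\t2\n3\t4\t5")  # second data row is one field short
read.table(con, header = TRUE, sep = "\t", fill = TRUE)
# The short row comes back as 1 2 NA rather than raising an error.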
Note that you do not need to specify the 'nrows' argument unless you
want fewer than all of the rows. Using the combination of 'skip' and
'nrows', you can read a subset of rows from the middle of the input
file.
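As a minimal sketch of that (reusing the file name from your call purely
for illustration), reading 50 rows starting at data row 101 might look
like this; note that the header line has to be counted in 'skip' and the
column names reapplied by hand:

## Grab the column names from the header first
hdr <- names(read.table('AllFBAllFieldsNAShorter.txt',
                        header = TRUE, sep = "\t", nrows = 1))

## Skip the header line plus the first 100 data rows, then read 50 rows
middle <- read.table('AllFBAllFieldsNAShorter.txt',
                     skip = 101,        # 1 header line + 100 data rows
                     nrows = 50,
                     sep = "\t",
                     col.names = hdr)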
See if that helps. Usually when there are column alignment problems, it
is because the rows are not being consistently parsed into fields, which
frequently results from the field delimiter not being specified
correctly.
One last thought: be sure that there is no '#' in your data file. By
default, '#' is interpreted as a comment character, which means that
anything after it on a row is ignored.
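If it turns out that your data do contain '#' characters, one workaround
(sketched here with the same call as above) is to turn off comment
processing altogether with the 'comment.char' argument:

AllFBImpFields <- read.table('AllFBAllFieldsNAShorter.txt',
                             header = TRUE,
                             row.names = paste('a', 1:15797, sep = ''),
                             as.is = TRUE,
                             sep = "\t",
                             comment.char = "")  # treat '#' as ordinary data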
HTH,
Marc Schwartz