[R] R and file size
Patrick Connolly
p.connolly at hortresearch.co.nz
Mon Jan 13 01:15:03 CET 2003
On Tue, 07-Jan-2003 at 10:52AM +0000, ripley at stats.ox.ac.uk wrote:
|> On Mon, 6 Jan 2003, Greg Blevins wrote:
|>
|> > I will be involved with an analysis based on a file that will be roughly 25 meg. Assuming I have enough memory, are there any limitations to using R on a file this large?
|>
|> That's a small file!
|>
|> Seriously, people work on datasets of 100Mb or so in 1Gb (or even 512Mb)
|> machines. However, some care is needed to select a good way to read the
|> data in (if read.table, do follow the advice on the help page), and it
My attention was drawn to said help page, and I noticed something
rather odd.
Usage:
read.table(file, header = FALSE, sep = "", quote = "\"'", dec = ".",
row.names, col.names, as.is = FALSE, na.strings = "NA",
colClasses = NA, nrows = -1,
skip = 0, check.names = TRUE, fill = !blank.lines.skip,
strip.white = FALSE, blank.lines.skip = TRUE,
comment.char = "#")
However, when I check out the function itself:
> args(read.table)
function (file, header = FALSE, sep = "", quote = "\"'", dec = ".",
row.names, col.names, as.is = FALSE, na.strings = "NA", skip = 0,
check.names = TRUE, fill = !blank.lines.skip, strip.white = FALSE,
blank.lines.skip = TRUE)
Consequently, I can't set nrows to see if it would be an improvement.
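For what it's worth, the sort of call the help page appears to describe
would look something like the following. This is only a sketch: the file
name and column classes are made up, and it assumes a version of R whose
read.table() actually accepts nrows and colClasses (mine, as shown
below, does not).

```r
## Hypothetical example based on the help page's advice for large files.
## "mydata.dat" and the column classes are invented for illustration.
df <- read.table("mydata.dat",
                 header = TRUE,
                 sep = "\t",
                 ## declaring column classes avoids type-guessing passes
                 colClasses = c("character", "numeric", "numeric"),
                 ## an upper bound on the row count lets R allocate once
                 nrows = 100000,
                 ## disabling comment scanning can also speed reading
                 comment.char = "")
```

On a version where read.table() lacks these arguments, scan() with an
explicit `what` list is the usual way to get the same effect.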
> version
         _
platform i686-pc-linux-gnu
arch     i686
os       linux-gnu
system   i686, linux-gnu
status
major    1
minor    6.1
year     2002
month    11
day      01
language R
Something simple, no doubt, but nothing I can see.
Ideas welcome.
--
Patrick Connolly
HortResearch
Mt Albert
Auckland
New Zealand
Ph: +64-9 815 4200 x 7188
~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~
I have the world's largest collection of seashells. I keep it on all
the beaches of the world ... Perhaps you've seen it. ---Steven Wright
~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~.~