[R] read.table() and precision?

Moshe Olshansky m_olshansky at yahoo.com
Tue Dec 18 08:02:03 CET 2007


Dear List,

Following the question below, I have a question of my
own: suppose I have large matrices that are produced
sequentially and must later be used sequentially in
reverse order. I do not have enough memory to keep
them all, so I would like to write them to disk and
read them back later. This raises two questions:
1) What is the fastest (and most space-efficient) way
to do this?
2) Functions like write(), write.table(), etc. write
the data as it is printed, which may result in a loss
of accuracy. Is there any way to prevent this, other
than setting the "digits" option to a higher value or
calling format() before writing? Is it possible to
write binary files (as in Fortran)? One possible
approach is sketched below.
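
To make the question concrete, here is a minimal
sketch of one possible approach (the file names and
matrix sizes are made up for illustration): each
matrix is written to its own file as raw binary with
writeBin(), which keeps the full double precision, and
is read back in reverse order with readBin().

## write a matrix to a binary file: dimensions first, then the data
write.matrix.bin <- function(m, filename) {
    con <- file(filename, "wb")
    on.exit(close(con))
    writeBin(dim(m), con)          # two integers: nrow, ncol
    writeBin(as.vector(m), con)    # doubles, no loss of precision
}

## read such a file back into a matrix
read.matrix.bin <- function(filename) {
    con <- file(filename, "rb")
    on.exit(close(con))
    d <- readBin(con, what = "integer", n = 2)
    matrix(readBin(con, what = "double", n = prod(d)),
           nrow = d[1], ncol = d[2])
}

## produce the matrices sequentially ...
for (i in 1:10) {
    m <- matrix(rnorm(100 * 100), 100, 100)
    write.matrix.bin(m, paste("mat_", i, ".bin", sep = ""))
}

## ... and later consume them in reverse order
for (i in 10:1) {
    m <- read.matrix.bin(paste("mat_", i, ".bin", sep = ""))
    ## do something with m here
}

Alternatively, save()/load() (or serialize()) also
store R objects in a binary format and so preserve
full precision, at the cost of a little per-object
overhead.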

Any suggestions will be greatly appreciated.

--- Wojciech Gryc <wojciech at gmail.com> wrote:

> Hi,
> 
> I'm currently working with data that has values as
> large as 99,000,000
> but is accurate to 6 decimal places. Unfortunately,
> when I load the
> data using read.table(), it rounds everything to the
> nearest integer.
> Is there any way for me to preserve the information
> or work with
> arbitrarily large floating point numbers?
> 
> Thank you,
> Wojciech
> 
> -- 
> 
> Five Minutes to Midnight:
> Youth on human rights and current affairs
> http://www.fiveminutestomidnight.org/
> 
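
Regarding the quoted question: read.table() stores
numeric columns as doubles, so the decimals are most
likely still there; it is probably only the default
printing with options(digits = 7) that makes a value
like 99000000.123456 display as 99000000. A small
illustration (the file name in the commented lines is
made up):

x <- 99000000.123456
print(x)               # shows 99000000 with the default digits = 7
print(x, digits = 15)  # shows 99000000.123456
options(digits = 15)   # or raise the default for the whole session

## if a column really is being mis-parsed, its type can be forced:
## dat <- read.table("data.txt", header = TRUE, colClasses = "numeric")

A double holds roughly 15-16 significant decimal
digits, so values around 99,000,000 with six decimal
places are still representable without loss.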


