[R-sig-Geo] problem reading large ncdf on windows 7 64 bits

Pascal Oettli kridox at ymail.com
Wed Feb 13 07:42:31 CET 2013


Hello,

Which version of NetCDF binaries are you using?
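
(From within R, a quick way to gather the related details -- a minimal
sketch using only base R functions -- is:

    R.version$arch          # should be "x86_64" on 64-bit R
    packageVersion("ncdf")  # version of the R interface package
    sessionInfo()           # platform and attached packages

On Windows the NetCDF C library is bundled with the binary package, so
knowing the package version helps identify which library build is
involved.)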

Regards,
Pascal


On 13/02/2013 10:21, Teurlai Magali wrote:
> Hi all,
>
> After reading this help post:
> https://stat.ethz.ch/pipermail/r-help/2011-November/297141.html
>
> I think I have the
>
> "issue in the Windows 64-bit NetCDF external dependency"
>
> and would like to know whether it is possible to fix it, or whether I
> should switch to another computer to solve my problem (which could take
> some time, given that I have a lot of files to transfer and process).
>
> Here is my problem:
> - I have to read a large ncdf file (3.8 GB)
> - it contains 20 variables; each variable's dimensions are 250 (lon) * 270 (lat) * 730 (time)
> - my computer runs Windows 7 64-bit
> - I have installed R version 2.15.2 (64-bit)
>
> - I am using the ncdf package; the open.ncdf function works fine.
> - I use the get.var.ncdf function to retrieve the variables, as
>   sketched below.
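>
> A minimal sketch of that reading pattern, with the file name and
> variable name ("myfile.nc", "precip") as placeholders:
>
>     library(ncdf)
>     nc <- open.ncdf("myfile.nc")
>     v  <- get.var.ncdf(nc, "precip",
>                        start = c(1, 1, 419),    # lon, lat, time
>                        count = c(250, 270, 1))  # a single time step
>     close.ncdf(nc)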
>
> - when I read the ncdf file one time step at a time, it reads OK from time steps 1 to 418
> - R crashes whenever I try to read any of the variables for a time step
>   >= 419 (no matter how much I "zoom" into the spatial dimensions, and
>   even if I try to read only one time step over a small spatial extent),
>   with the following message:
>
> "Microsoft Visual C++ Runtime Library:
> This application has requested the Runtime to terminate it in an unusual way.
> Please contact the application's support team for more information"
>
> On another computer (running Linux):
> - Ferret is able to retrieve an entire variable
> - MATLAB is able to retrieve an entire variable
> - R is able to retrieve an entire variable (so it is not a memory problem)
> - if we cut the ncdf file down to only the first 418 time steps, the new file is then... 2.0 GB
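>
> (For reference, one way to produce a cut like that, sketched from R
> and assuming the NCO tools are installed and the record dimension is
> named "time" -- file names are placeholders:
>
>     # keep only time steps 1..418 (NCO indices are 0-based)
>     system("ncks -d time,0,417 full.nc cut418.nc")
>
> The same command can of course be run directly from the shell.)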
>
> On my computer:
> - if I use a ncdf file that has been "cut" spatially on another
> computer, I can extract any entire variable over all 730 time steps
> (the new file is then 1.8 GB instead of 3.8 GB)
> - if I use a ncdf file that keeps all the spatial information and all
> the time steps, but contains only one variable, I can read this entire
> variable (the new file is then 800 MB)
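>
> (A single-variable copy like that can be made with the same package --
> a rough sketch, with "precip" standing in for an actual variable name
> and file names as placeholders:
>
>     library(ncdf)
>     nc  <- open.ncdf("full.nc")
>     v   <- nc$var[["precip"]]        # variable object, with its dims
>     dat <- get.var.ncdf(nc, v)       # reads the whole variable
>     out <- create.ncdf("precip_only.nc", v)
>     put.var.ncdf(out, v, dat)
>     close.ncdf(out)
>     close.ncdf(nc)
>
> Since this reads the whole variable into memory, it has to run on a
> machine where the read succeeds.)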
>
> So it looks as if R were having trouble extracting variables from ncdf files larger than 2 GB when running under Windows.
>
> - Has anyone got an explanation for this? (I thought the big advantage
> of working with ncdf files was the ability to handle huge files,
> because R does not have to read the entire file. To read time step 419,
> is R loading all the previous information into memory?)
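>
> (One check that might be relevant, assuming the NetCDF command-line
> utilities are available: files larger than 2 GB normally need the
> "64-bit offset" or netCDF-4 format, and the library reading them must
> be built with large-file support. The file's format can be queried
> with:
>
>     system("ncdump -k full.nc")  # prints "classic", "64-bit offset", ...
>
> where "full.nc" is a placeholder file name.)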
>
> - Is there a way to fix this? (Or should I process all my files on another computer?)
>
> Thank you in advance for all your answers.
>
> Magali
>
>
>
>
>
> _______________________________________________
> R-sig-Geo mailing list
> R-sig-Geo at r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>


