[R] memory, i am getting mad in reading climate data

Roy Mendelssohn roy.mendelssohn at noaa.gov
Sat Mar 17 21:53:12 CET 2012


Hi All:


> Every system has limits.  If you have lots of money, then invest in a
> 64-bit system with 100GB of real memory and you probably won't hit its
> limits for a while.  Otherwise, look at taking incremental steps and
> possibly determining if you can partition the data.  You might
> consider a relational database to store the data so that it is easier
> to select a subset of data to process.

netCDF has some very simple mechanisms for reading in just part of the data, and they are well implemented in the ncdf and ncdf4 packages. Reading a subset this way is also very fast.

Try:

?get.var.ncdf

which will explain how to do so.
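For example, here is a minimal sketch using the ncdf package (the file name 'ex.nc' and the variable name 'Temperature' come from the original post; the assumption that the variable has three dimensions such as (lon, lat, time) is mine):

library(ncdf)

# Open the file; this reads only the metadata, not the data itself
ex.nc <- open.ncdf("ex.nc")

# Read a single time step instead of the whole 2.8 GB array.
# 'start' gives the starting index along each dimension;
# 'count' gives how many values to read (-1 means "all of that dimension").
temperature.slice <- get.var.ncdf(ex.nc, "Temperature",
                                  start = c(1, 1, 1),
                                  count = c(-1, -1, 1))

close.ncdf(ex.nc)

You can loop over time steps (or spatial blocks) this way and process each piece separately, so the full array never has to fit in memory. The ncdf4 equivalents are nc_open(), ncvar_get(), and nc_close(), and ncvar_get() accepts the same start/count arguments.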

-Roy M.


> 2012/3/17 Uwe Ligges <ligges at statistik.tu-dortmund.de>:
>> 
>> 
>> On 17.03.2012 19:27, David Winsemius wrote:
>>> 
>>> 
>>> On Mar 17, 2012, at 10:33 AM, Amen wrote:
>>> 
>>>> I faced this problem when typing:
>>>> 
>>>> temperature <- get.var.ncdf( ex.nc, 'Temperature' )
>>>> 
>>>> *unable to allocate a vector of size 2.8 GB*

**********************
"The contents of this message do not reflect any position of the U.S. Government or NOAA."
**********************
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
1352 Lighthouse Avenue
Pacific Grove, CA 93950-2097

e-mail: Roy.Mendelssohn at noaa.gov (Note new e-mail address)
voice: (831)-648-9029
fax: (831)-648-8440
www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.


