[R] reducing memory usage WAS: Problems with R memory usage on Linux
B. Bogart
ben at ekran.org
Wed Oct 15 23:13:46 CEST 2008
Hello,
I have read the R memory pages.
I realized after my post that I would not have enough memory to
accomplish this task.
The command I'm using to convert the list into a data frame is as follows:
som <- do.call("rbind", somlist)
Here som is the data frame that results from combining all the data frames
in somlist.
Is there a way I can remove each item from the list (and gc()) once it has
been copied into the som data frame? That way the memory usage should
stay about the same, rather than doubling or tripling.
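What I have in mind is something like this untested sketch, which appends
one piece at a time and drops it from somlist as it goes (rbind still needs
the old and the new som to coexist while it copies, so the peak is still
around twice the final size near the end):

som <- somlist[[1]]
somlist[[1]] <- NULL
while (length(somlist) > 0) {
  som <- rbind(som, somlist[[1]])  # append the next piece
  somlist[[1]] <- NULL             # drop it from the list
  gc()                             # collect the freed piece right away
}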
Any other suggestions for reducing memory usage? (I'm already running
only blackbox and a single terminal to do the job.)
I do have enough memory to store somlist twice over, but the do.call
bails before it's done, so I suppose it uses a workspace, meaning I need
more than 2x the size of somlist to collect it?
Is there another function that does the same thing but uses only 2x the
size of somlist in memory?
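One idea: preallocate the full result and fill it block by block, so
nothing beyond the list and the result themselves is ever needed. An
untested sketch, assuming every data frame in somlist has the same columns:

rows <- sapply(somlist, nrow)
som <- somlist[[1]][rep.int(1, sum(rows)), ]        # allocate the full-size result once
offset <- 0
for (i in seq_along(somlist)) {
  som[offset + seq_len(rows[i]), ] <- somlist[[i]]  # copy piece i into its block of rows
  offset <- offset + rows[i]
}
rownames(som) <- NULL                               # drop the duplicated row names

If the columns are all numeric, collecting into a matrix instead of a
data frame would probably be leaner still.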
Thanks for your help,
Prof Brian Ripley wrote:
> Or ?"Memory-limits" (and the posting guide of course).
>
> On Wed, 15 Oct 2008, Prof Brian Ripley wrote:
>
>> See ?"Memory-size"
>>
>> On Wed, 15 Oct 2008, B. Bogart wrote:
>>
>>> Hello all,
>>>
>>> I'm working with a large data set, and upgraded my RAM to 4GB to help
>>> with the memory use.
>>>
>>> I've got a 32-bit kernel with 64GB memory support compiled in.
>>>
>>> gnome-system-monitor and free both show the full 4GB as being available.
>>>
>>> In R I was doing some processing and got the following message (when
>>> collecting 100 data frames of 307200*8 each into a single data frame
>>> for plotting):
>>>
>>> Error: cannot allocate vector of size 2.3 Mb
>>>
>>> So I checked the R memory usage:
>>>
>>> $ ps -C R -o size
>>> SZ
>>> 3102548
>>>
>>> I tried removing some objects and running gc(). R then shows much less
>>> memory being used:
>>>
>>> $ ps -C R -o size
>>> SZ
>>> 2732124
>>>
>>> Which should give me an extra ~360MB to work with in R.
>>>
>>> I still get the same error about R being unable to allocate another
>>> 2.3MB.
>>>
>>> I deleted well over 2.3MB of objects...
>>>
>>> Any suggestions on how to get around this?
>>>
>>> Is the only way to use all 4GB in R to use a 64-bit kernel?
>>>
>>> Thanks all,
>>> B. Bogart
>>>
>>
>> --
>> Brian D. Ripley, ripley at stats.ox.ac.uk
>> Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
>> University of Oxford, 1 South Parks Road, Oxford OX1 3TG, UK
>> Tel: +44 1865 272861 (self), +44 1865 272866 (PA)
>> Fax: +44 1865 272595
>>