[R] memory error with 64-bit R in linux
James MacDonald
jmacdon at med.umich.edu
Thu Jul 19 03:51:01 CEST 2007
The dist object for the rows of the matrix covers all 16000 x 16000
pairwise distances; dist() stores the lower triangle, about 128 million
doubles, or roughly 1 GB, so even one extra copy will easily consume all
of your RAM.
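As a quick sanity check, the size of that dist object can be computed directly (assuming the default 8-byte doubles; dist() stores only the lower triangle):

```r
# A dist object for n rows holds n*(n-1)/2 distances (lower triangle only).
n <- 16000
n_elements <- n * (n - 1) / 2        # 127,992,000 distances
size_mb <- n_elements * 8 / 2^20     # 8 bytes per double
round(size_mb)                       # about 976 MB for a single copy
```

So a single copy is already close to half the 2 GB of physical RAM, before heatmap() makes any duplicates.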
A more pertinent question is: what use would a heatmap of that size be?
How do you plan to visualize 16000 rows? In a PDF? You certainly
couldn't publish such a thing, nor would it be useful as a figure in a
presentation.
You would probably be better off filtering down to a more reasonable
number of rows (say, 500 or fewer) and using that to make your heatmap.
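A minimal sketch of that filtering step (the matrix `mat` is simulated here, and row variance is just one reasonable filter statistic; neither comes from the original post):

```r
set.seed(1)
mat <- matrix(rnorm(16000 * 100), nrow = 16000)  # stand-in for the real data

rv <- apply(mat, 1, var)                     # one statistic per row
keep <- order(rv, decreasing = TRUE)[1:500]  # indices of the 500 most variable rows
mat_small <- mat[keep, ]

dim(mat_small)      # 500 x 100
# heatmap(mat_small)  # now clusters a 500-point dist, not 16000 x 16000
```

The dist object for 500 rows is only about 1 MB, so clustering and drawing are fast.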
Best,
Jim
jim holtman wrote:
> Are you paging? That might explain the long run times. How much space
> are your other objects taking up? The matrix by itself should only
> require about 13MB if it is numeric. I would guess it is some of the
> other objects that you have in your working space. Put some gc() in
> your loop to see how much space is being used. Run it with a subset
> of the data and see how long it takes. This might give you an
> estimate of the time, and space, that might be needed for the entire
> dataset.
>
> Do a 'ps' to see how much memory your process is using. Do one every
> couple of minutes to see if it is growing. You can always use Rprof()
> to get an idea of where time is being spent (use it on a small
> subset).
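Those suggestions can be sketched as follows (the object names and subset sizes are placeholders, not from the original post):

```r
set.seed(1)
mat <- matrix(rnorm(16000 * 100), nrow = 16000)  # stand-in for the real data

# Time dist() on a subset; dist() is O(n^2) in time and memory,
# so scale the result by (16000 / subset size)^2.
sub <- mat[1:1000, ]
gc(reset = TRUE)                       # reset the "max used" counters
t_sub <- system.time(d <- dist(sub))
gc()                                   # "max used" column shows peak memory
est_seconds <- unname(t_sub["elapsed"]) * (16000 / 1000)^2

# Rprof() shows where the time actually goes; use a subset large
# enough for the sampling profiler to collect some samples.
Rprof("dist-profile.out")
invisible(dist(mat[1:4000, ]))
Rprof(NULL)
head(summaryRprof("dist-profile.out")$by.total)
```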
>
> On 7/18/07, zhihua li <lzhtom at hotmail.com> wrote:
>> Hi netters,
>>
>> I'm using the 64-bit R-2.5.0 on an x86-64 CPU with 2 GB of RAM. The
>> operating system is SUSE 10.
>> The system information is:
>> -uname -a
>> Linux someone 2.6.13-15.15-smp #1 SMP Mon Feb 26 14:11:33 UTC 2007 x86_64
>> x86_64 x86_64 GNU/Linux
>>
>> I used heatmap to process a matrix of dim [16000, 100]. After 3 hours
>> of desperate waiting, R told me:
>> cannot allocate vector of size 896 MB.
>>
>> I know the matrix is very big, but since I have 2 GB of RAM on a
>> 64-bit system, shouldn't it be able to handle a vector smaller than
>> 1 GB? (I was not running any other applications on my system.)
>>
>> Does anyone know what's going on? Is there a hardware limit that
>> means I have to add more RAM, or is there some way to resolve it in
>> software? Also, is it possible to speed up the computation? (I don't
>> want to wait another 3 hours just to get another error message.)
>>
>> Thank you in advance!
>>
>> ______________________________________________
>> R-help at stat.math.ethz.ch mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
>> http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
--
James W. MacDonald, MS
Biostatistician
UMCCC cDNA and Affymetrix Core
University of Michigan
1500 E Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623